| datasetId | card |
|---|---|
suolyer/wudao | ---
license: apache-2.0
---
|
Jeffreyzhaoliang/vint-6d | ---
license: mit
---
This dataset is for VinT_Bench: Benchmarking the Object-in-hand Pose from Vision, Touch, and Proprioception.
Senlin updated vint-sim, and Zhaoliang updated vint-real. |
valurank/News_Articles_Categorization | ---
license:
- other
language:
- en
multilinguality:
- monolingual
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
# Dataset Card for News_Articles_Categorization
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Source Data](#source-data)
## Dataset Description
3,722 news articles classified into eight categories: World, Politics, Tech, Entertainment, Sport, Business, Health, and Science.
## Languages
The text in the dataset is in English.
## Dataset Structure
The dataset consists of two columns: Text and Category.
The Text column contains the news article, and the Category column contains the class the article belongs to.
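As a minimal sketch of this two-column layout (only the column names and class labels come from the card; the example rows below are invented for illustration):

```python
from collections import Counter

# Hypothetical rows mirroring the card's schema: one "Text" field
# holding an article and one "Category" field holding its class.
rows = [
    {"Text": "Markets rallied after the central bank held rates steady.",
     "Category": "Business"},
    {"Text": "The striker scored twice in the cup final.",
     "Category": "Sport"},
    {"Text": "A new exoplanet was spotted by the space telescope.",
     "Category": "Science"},
]

# Each article carries exactly one label, matching the
# multi-class-classification task id declared in the metadata.
label_counts = Counter(row["Category"] for row in rows)
print(label_counts["Sport"])  # 1
```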
## Source Data
The dataset is scraped from different news platforms.
|
ccore/rhetoric-saint-thomas-aquinas | ---
license: mit
---
Whether God Is Composed of Matter and Form?
Objection 1: It seems that God is composed of matter and form. For
whatever has a soul is composed of matter and form; since the soul is
the form of the body. But Scripture attributes a soul to God; for it
is mentioned in Hebrews (Heb. 10:38), where God says: "But My just man
liveth by faith; but if he withdraw himself, he shall not please My
soul." Therefore God is composed of matter and form.
Objection 2: Further, anger, joy and the like are passions of the
composite. But these are attributed to God in Scripture: "The Lord was
exceeding angry with His people" (Ps. 105:40). Therefore God is
composed of matter and form.
Objection 3: Further, matter is the principle of individualization.
But God seems to be individual, for He cannot be predicated of many.
Therefore He is composed of matter and form.
Contrary: Whatever is composed of matter and form is a body;
for dimensive quantity is the first property of matter. But God is not
a body as proved in the preceding Article; therefore He is not
composed of matter and form.
Response: It is impossible that matter should exist in God.
First, because matter is in potentiality. But we have shown (Q. 2, A. 3)
that God is pure act, without any potentiality. Hence it is
impossible that God should be composed of matter and form. Secondly,
because everything composed of matter and form owes its perfection and
goodness to its form; therefore its goodness is participated, inasmuch
as matter participates the form. Now the first good and the
best--viz. God--is not a participated good, because the essential
good is prior to the participated good. Hence it is impossible that
God should be composed of matter and form. Thirdly, because every
agent acts by its form; hence the manner in which it has its form is
the manner in which it is an agent. Therefore whatever is primarily
and essentially an agent must be primarily and essentially form. Now
God is the first agent, since He is the first efficient cause. He is
therefore of His essence a form; and not composed of matter and form.
Reply Objection 1: A soul is attributed to God because His acts
resemble the acts of a soul; for, that we will anything, is due to our
soul. Hence what is pleasing to His will is said to be pleasing to His
soul.
Reply Objection 2: Anger and the like are attributed to God on
account of a similitude of effect. Thus, because to punish is properly
the act of an angry man, God's punishment is metaphorically spoken of
as His anger.
Reply Objection 3: Forms which can be received in matter are
individualized by matter, which cannot be in another as in a subject
since it is the first underlying subject; although form of itself,
unless something else prevents it, can be received by many. But that
form which cannot be received in matter, but is self-subsisting, is
individualized precisely because it cannot be received in a subject;
and such a form is God. Hence it does not follow that matter exists in
God.
_______________________ |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_dev_cot-mathemak-6b9a5d-1879664171 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_dev_cot
eval_info:
task: text_zero_shot_classification
model: ArthurZ/opt-350m
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_dev_cot
dataset_config: mathemakitten--winobias_antistereotype_dev_cot
dataset_split: validation
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: ArthurZ/opt-350m
* Dataset: mathemakitten/winobias_antistereotype_dev_cot
* Config: mathemakitten--winobias_antistereotype_dev_cot
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_CM_D_PNP_GENERIC_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 12314233
num_examples: 1000
download_size: 2135686
dataset_size: 12314233
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_CM_D_PNP_GENERIC_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lewtun/benchmarks-gem-submission | ---
benchmark: gem
type: prediction
submission_name: This is a test name
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test name |
harsh13333/shipping_label_ner | ---
license: afl-3.0
---
|
DynamicSuperb/VoiceConversion_VCTK | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: source_speech_id
dtype: string
- name: source_speech
dtype:
audio:
sampling_rate: 48000
- name: source_transcription
dtype: string
- name: target_speech_id
dtype: string
- name: target_speech
dtype:
audio:
sampling_rate: 48000
- name: target_transcription
dtype: string
- name: label_id
dtype: string
- name: label
dtype:
audio:
sampling_rate: 48000
- name: label_transcription
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 3132068107.564
num_examples: 2001
download_size: 2043675326
dataset_size: 3132068107.564
---
# Dataset Card for "VoiceConversion_VCTK"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1 | ---
pretty_name: Evaluation run of dvruette/llama-13b-pretrained-sft-epoch-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dvruette/llama-13b-pretrained-sft-epoch-1](https://huggingface.co/dvruette/llama-13b-pretrained-sft-epoch-1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T22:06:45.407147](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1/blob/main/results_2023-10-18T22-06-45.407147.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2225251677852349,\n\
\ \"em_stderr\": 0.004259635026591598,\n \"f1\": 0.287082634228188,\n\
\ \"f1_stderr\": 0.004255345667621572,\n \"acc\": 0.45729496587127727,\n\
\ \"acc_stderr\": 0.01062102533078612\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2225251677852349,\n \"em_stderr\": 0.004259635026591598,\n\
\ \"f1\": 0.287082634228188,\n \"f1_stderr\": 0.004255345667621572\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13874147081122062,\n \
\ \"acc_stderr\": 0.009521649920798148\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774092\n\
\ }\n}\n```"
repo_url: https://huggingface.co/dvruette/llama-13b-pretrained-sft-epoch-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T22_06_45.407147
path:
- '**/details_harness|drop|3_2023-10-18T22-06-45.407147.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T22-06-45.407147.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T22_06_45.407147
path:
- '**/details_harness|gsm8k|5_2023-10-18T22-06-45.407147.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T22-06-45.407147.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:41:46.574881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:41:46.574881.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:41:46.574881.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T22_06_45.407147
path:
- '**/details_harness|winogrande|5_2023-10-18T22-06-45.407147.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T22-06-45.407147.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_41_46.574881
path:
- results_2023-07-19T19:41:46.574881.parquet
- split: 2023_10_18T22_06_45.407147
path:
- results_2023-10-18T22-06-45.407147.parquet
- split: latest
path:
- results_2023-10-18T22-06-45.407147.parquet
---
# Dataset Card for Evaluation run of dvruette/llama-13b-pretrained-sft-epoch-1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dvruette/llama-13b-pretrained-sft-epoch-1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dvruette/llama-13b-pretrained-sft-epoch-1](https://huggingface.co/dvruette/llama-13b-pretrained-sft-epoch-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1",
"harness_winogrande_5",
    split="latest")
```
## Latest results
These are the [latest results from run 2023-10-18T22:06:45.407147](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__llama-13b-pretrained-sft-epoch-1/blob/main/results_2023-10-18T22-06-45.407147.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.2225251677852349,
"em_stderr": 0.004259635026591598,
"f1": 0.287082634228188,
"f1_stderr": 0.004255345667621572,
"acc": 0.45729496587127727,
"acc_stderr": 0.01062102533078612
},
"harness|drop|3": {
"em": 0.2225251677852349,
"em_stderr": 0.004259635026591598,
"f1": 0.287082634228188,
"f1_stderr": 0.004255345667621572
},
"harness|gsm8k|5": {
"acc": 0.13874147081122062,
"acc_stderr": 0.009521649920798148
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774092
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SHS/newest_biored | ---
dataset_info:
features:
- name: pmid
dtype: string
- name: passage
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: string
splits:
- name: test
num_bytes: 576610
num_examples: 97
- name: train
num_bytes: 2259680
num_examples: 387
- name: val
num_bytes: 604670
num_examples: 98
download_size: 1083243
dataset_size: 3440960
---
# Dataset Card for "newest_biored"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kjj0/cifar10-multirun-logits | ---
license: mit
---
# A kernel function which improves the accuracy and interpretability of large ensembles of neural networks
We describe a new kernel (i.e. similarity function between pairs of examples) which is computed using an ensemble of neural networks. It has the following properties:
- Using it to predict test labels (via k-nearest neighbors across the training set) yields even higher accuracy than the standard ensemble inference method
of averaging predictions, once the number of networks exceeds about 100. We believe this kernel + k-NN method is the state of the art for inference with large ensembles
(although such ensembles are rarely used in practice).
- Being a similarity function, it is highly interpretable. For each test example, it allows us to visualize training examples which are deemed to have
similar features by the training process, with much greater fidelity than e.g. penultimate layer embeddings. For instance, we use this to identify the (known) fact that
~10% of the CIFAR-10 test-set examples have a near-duplicate in the training set, and to identify a failure mode.
To compute the kernel for an ensemble of n=500 models, we provide the following simple code (which can be copy-paste run in your environment).
```python
import torch
import torchvision
import huggingface_hub
def normalize(logits):
logits = logits.float()
logits = logits.log_softmax(-1)
logits = (logits - logits.mean(0, keepdim=True)) / logits.std(0, keepdim=True)
return logits
def compute_kernel(logits1, logits2):
logits1 = normalize(logits1)
logits2 = normalize(logits2)
assert len(logits1) == len(logits2)
kernel = torch.zeros(logits1.shape[1], logits2.shape[1]).cuda()
for c in range(10):
logits1_cls = logits1[..., c].cuda()
logits2_cls = logits2[..., c].cuda()
corr_cls = (logits1_cls.T @ logits2_cls) / len(logits1)
kernel += corr_cls / 10
return kernel
######################################################################################
# Setup: Download CIFAR-10 labels and the outputs from 500 repeated training runs. #
######################################################################################
labels_train = torch.tensor(torchvision.datasets.CIFAR10('cifar10', train=True).targets)
labels_test = torch.tensor(torchvision.datasets.CIFAR10('cifar10', train=False).targets)
api = huggingface_hub.HfApi()
fname = 'logs_saveoutputs_main/06109e85-f5d7-4ac8-b0b0-f03542f23234/log.pt'
obj_path = api.hf_hub_download('kjj0/cifar10-multirun-logits', repo_type='dataset',
filename=fname)
obj = torch.load(obj_path, map_location='cpu')
# print(obj['code']) # Uncomment if you want to see the training code
######################################################################################
# Evaluate both the per-model and ensembled accuracy of the training outputs. #
######################################################################################
each_acc = (obj['logits'].argmax(-1) == labels_test).float().mean(1)
avg_acc = each_acc.mean()
print('average single-model accuracy \t: %.2f' % (100 * avg_acc))
ens_pred = obj['logits'].mean(0).argmax(1)
ens_acc = (ens_pred == labels_test).float().mean()
print('ensemble accuracy (%d models) \t: %.2f' % (len(obj['logits']), 100 * ens_acc))
# (n.b. averaging probabilities instead of logits makes no difference)
######################################################################################
# Evaluate the new kernel / ensemble inference method. #
######################################################################################
# use correlations between log_softmax outputs as a similarity metric for k-NN inference.
kernel = compute_kernel(obj['logits'], obj['logits_train'])
k = 3
nbrs = kernel.topk(k, dim=1)
nbr_labels = labels_train[nbrs.indices.cpu()]
pred = nbr_labels.mode(1).values
acc = (pred == labels_test).float().mean()
print('kernel accuracy (k-NN w/ k=%d) \t: %.2f' % (k, 100 * acc))
## average single-model accuracy : 93.26
## ensemble accuracy (500 models) : 94.69
## kernel accuracy (k-NN w/ k=3) : 95.01
```
The training configuration we used to generate these 500 models (i.e. the script that we re-ran 500 times with different random seeds) yields a mean accuracy of 93.26%.
If we average the predictions across those 500 models, we attain a much improved accuracy of 94.69%.
If we predict the test-set labels using our kernel applied to pairs of (train, test) examples, using k-nearest neighbors with k=3,
then we attain an even higher accuracy of 95.01%.
We include 20,000 total runs of training for the same training configuration that generated the 500 runs used in the above.
The outputs of those runs (i.e. the logits predicted by the final model on the training and test examples) can be found as the other files in `logs_saveoutputs_main`.
If we compute the kernel with all 20,000 runs instead of 500, and use a weighting scheme based on the correlation values,
then the accuracy can be further increased to 95.53%.
Note that increasing from 500 to 20,000 does not improve the accuracy of the averaged predictions,
so with 95.53% we have reached 0.84% higher than the standard ensemble accuracy.
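The exact correlation-based weighting used to reach 95.53% is not described above; the following is one plausible sketch, in which each of the k nearest training neighbors votes with its kernel value rather than voting equally (it assumes `kernel` and `labels_train` live on the same device):

```python
import torch

def weighted_knn_predict(kernel, labels_train, k=3, num_classes=10):
    # kernel: (num_test, num_train) similarity matrix from compute_kernel.
    # Each of the k most similar training examples votes for its label,
    # weighted by its correlation value (a hypothetical weighting scheme).
    nbrs = kernel.topk(k, dim=1)
    votes = torch.zeros(kernel.shape[0], num_classes)
    for j in range(k):
        cls = labels_train[nbrs.indices[:, j]]
        votes[torch.arange(kernel.shape[0]), cls] += nbrs.values[:, j]
    return votes.argmax(1)
```

Setting all weights to 1 essentially recovers the plain majority-vote k-NN used earlier.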
We additionally include outputs from three other training configurations; their kernels seem to have the same properties.
## Interpretability-type applications
### Finding similar pairs
(Below:) We rank the CIFAR-10 test-set examples by their similarity to their most similar training-set example.
We show the 601th-648th most highly ranked test examples (out of 10,000), along with their matched training examples.
Many of them turn out to be visually similar pairs.

We note that the penultimate-layer features almost entirely lack this property --
if we visualize the most similar pairs across all (test, train) pairs according to distance in penultimate feature space,
we will get not duplicates but instead just random highly confident examples which have all presumably collapsed to a similar point in space.
On the other hand, pairs which are given a high similarity score by our correlation kernel turn out to often be near-duplicates, and this holds true
for the most similar pairs even when we reduce the number of models in the ensemble down to a relatively small value like 10 or 20.
### Diagnosing failure modes
(Below:) We rank the CIFAR-10 test examples by how similar their most similar training-set example is, and then filter for cases where they have different labels.
The first (leftmost) column contains the top 8 such test examples, and then subsequent columns are their 9 nearest neighbors in the training set.
It appears that our network has difficulty seeing small objects.

### Some random examples
(Below:) We select 10 CIFAR-10 test examples at random (the first row), and display their two nearest neighbors according to the kernel (second two rows),
and the penultimate features from a single model (next two rows). The kernel yields images which are perceptually similar, whereas penultimate features
select nearly a random image of the same label.

## Open questions
* The usage of `log_softmax` in the normalization step seems to be important, especially for making the kernel work with n < 1,000 (where n is the number of networks).
But for n -> infty, it becomes less important. Why -- is it somehow removing noise?
* Via the Neural Network Gaussian Process (NNGP) theory, it is possible to compute the expectation of this kernel for untrained / newly initialized networks
(at least if the log-softmax is removed). Is there any general theory for what this kernel becomes after training (i.e., what we are seeing here)?
* This kernel is implemented as a sum of 10 correlation kernels -- one for each class. But upon inspection, each of those has dramatically worse
k-NN accuracy than their sum, at least until n becomes on the order of thousands. Why?
* Removing log-softmax, despite harming the overall accuracy as discussed earlier,
apparently increases the k-NN accuracy (and generally quality) of the individual kernels. Why??
* How does this kernel compare to [TRAK](https://arxiv.org/abs/2303.14186)
or the datamodel embeddings from [https://arxiv.org/abs/2202.00622](https://arxiv.org/abs/2202.00622)?
|
sinhala-nlp/NSINA-Categories | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
language:
- si
---
# Sinhala News Category Prediction
This is a text classification task created with the [NSINA dataset](https://github.com/Sinhala-NLP/NSINA). This dataset is also released with the same license as NSINA.
## Data
Data can be loaded into pandas dataframes using the following code.
```python
from datasets import Dataset
from datasets import load_dataset
train = Dataset.to_pandas(load_dataset('sinhala-nlp/NSINA-Categories', split='train'))
test = Dataset.to_pandas(load_dataset('sinhala-nlp/NSINA-Categories', split='test'))
```
## Citation
If you are using the dataset or the models, please cite the following paper.
~~~
@inproceedings{Nsina2024,
author={Hettiarachchi, Hansi and Premasiri, Damith and Uyangodage, Lasitha and Ranasinghe, Tharindu},
title={{NSINA: A News Corpus for Sinhala}},
booktitle={The 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)},
year={2024},
month={May},
}
~~~
|
PORTULAN/parlamento-pt | ---
annotations_creators:
- no-annotation
language:
- pt
license:
- other
multilinguality:
- monolingual
pretty_name: ParlamentoPT
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
tags:
- parlamentopt
- parlamento
- parlamento-pt
- albertina-pt*
- albertina-ptpt
- albertina-ptbr
- fill-mask
- bert
- deberta
- portuguese
- encoder
- foundation model
---
# Dataset Card for ParlamentoPT
### Dataset Summary
The ParlamentoPT is a **Portuguese** language data set obtained by collecting publicly available documents containing transcriptions of debates in the Portuguese Parliament.
The data was collected from the Portuguese Parliament portal in accordance with its [open data policy](https://www.parlamento.pt/Cidadania/Paginas/DadosAbertos.aspx).
This dataset was collected with the purpose of creating the [Albertina-PT*](https://huggingface.co/PORTULAN/albertina-ptpt) language model, and it serves as training data for model development.
The development of the model is a collaborative effort between the University of Lisbon and the University of Porto, in Portugal.
<br>
# Citation
When using or citing this data set, kindly cite the following [publication](https://arxiv.org/abs/2305.06721):
``` latex
@misc{albertina-pt,
title={Advancing Neural Encoding of Portuguese
with Transformer Albertina PT-*},
author={João Rodrigues and Luís Gomes and João Silva and
António Branco and Rodrigo Santos and
Henrique Lopes Cardoso and Tomás Osório},
year={2023},
eprint={2305.06721},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<br>
# Acknowledgments
The research reported here was partially supported by: PORTULAN CLARIN—Research Infrastructure for the Science and Technology of Language,
funded by Lisboa 2020, Alentejo 2020 and FCT—Fundação para a Ciência e Tecnologia under the
grant PINFRA/22117/2016; research project ALBERTINA - Foundation Encoder Model for Portuguese and AI, funded by FCT—Fundação para a Ciência e Tecnologia under the
grant CPCA-IAC/AV/478394/2022; innovation project ACCELERAT.AI - Multilingual Intelligent Contact Centers, funded by IAPMEI, I.P. - Agência para a Competitividade e Inovação under the grant C625734525-00462629, of Plano de Recuperação e Resiliência, call RE-C05-i01.01 – Agendas/Alianças Mobilizadoras para a Reindustrialização; and LIACC - Laboratory for AI and Computer Science, funded by FCT—Fundação para a Ciência e Tecnologia under the grant FCT/UID/CEC/0027/2020. |
fmattera/test_data2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning
dtype: image
- name: prompt
sequence: string
splits:
- name: train
num_bytes: 3854203.0
num_examples: 4
download_size: 3857683
dataset_size: 3854203.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_data2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
suke-sho/plant-genome-corpus | ---
license: mit
---
# Plant Genome Corpus
***
## About
This corpus consists of plant genomes from various species, including Arabidopsis thaliana, Solanum lycopersicum, Oryza sativa, Zea mays, Sorghum bicolor, and Glycine max.
The genomic data are sourced from reputable databases such as NCBI and Ensembl.
This diverse and comprehensive dataset is suitable for pre-training models aimed at understanding and interpreting plant genomic information.
## Contents (plant-genome-corpus)
|Species|Source|Version|
|:---:|:---:|:---:|
|Arabidopsis thaliana|NCBI|TAIR10|
|Solanum lycopersicum|NCBI|SL3.1|
|Oryza sativa|Ensembl|IRGSP-1.0|
|Zea mays|Ensembl|AGPv3|
|Sorghum bicolor|Ensembl|Sbi1|
|Glycine max|Ensembl|Gm01|
## Contents (plant-genome-multi-versions-corpus)
| Species | Source | Version |
|:---:|:---:|:---:|
| Arabidopsis thaliana | NCBI | build9.1 |
| Arabidopsis thaliana | NCBI | TAIR10 |
| Arabidopsis thaliana | Ensembl | TAIR9 |
| Oryza sativa | Ensembl | IRGSP-1.0 |
| Oryza sativa | Ensembl | MSU6 |
| Zea mays | Ensembl | AGPv2 |
| Zea mays | Ensembl | AGPv3 |
| Sorghum bicolor | Ensembl | Sbi1 |
| Glycine max | Ensembl | Gm01 |
| Solanum lycopersicum | NCBI | SL3.1 | |
gathnex/Gath_baize | ---
license: mit
---
|
logh/myself | ---
license: unknown
---
|
kaleinaNyan/wmt19_ru-en | ---
language:
- ru
- en
--- |
DigitalUmuganda/Monolingual_health_dataset | ---
license: cc-by-2.0
language:
- rw
- en
size_categories:
- 10K<n<100K
---
# Monolingual Dataset
This is a malnutrition dataset in Kinyarwanda and English; it will be translated using translators to make it a parallel corpus.
# Source of Data
1. Rwanda Biomedical Center (RBC) (26,390 sentences)
2. GPT-4 prompting (42,576 sentences) |
erkam/clevr-full-v5 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: depth
dtype: image
- name: layout
dtype: image
- name: colored_layout
dtype: image
- name: objects
sequence: int64
- name: boxes
sequence:
sequence: float32
- name: triplets
sequence:
sequence: int64
- name: objects_str
dtype: string
splits:
- name: train
num_bytes: 72217786.0
num_examples: 960
- name: val
num_bytes: 8935628.0
num_examples: 119
- name: test
num_bytes: 8912087.0
num_examples: 119
download_size: 88745185
dataset_size: 90065501.0
---
# Dataset Card for "clevr-full-v5"
25 objects with 4 spatial relationships
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maulinnasari/dataset_ext_20_mn_ns | ---
dataset_info:
features:
- name: document
sequence: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 160065061
num_examples: 44972
- name: validation
num_bytes: 19636553
num_examples: 5622
- name: test
num_bytes: 19797897
num_examples: 5622
download_size: 124873547
dataset_size: 199499511
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
bjoernp/the-stack-dedup-python-deu_Latn | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: int64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: int64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: int64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 267637689.56000614
num_examples: 48262
download_size: 90252233
dataset_size: 267637689.56000614
---
# Dataset Card for "the-stack-dedup-python-deu_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gabriel1322/freefire | ---
license: openrail
---
|
genesisqu/fake-real-news | ---
license: bsd
---
|
HydraLM/partitioned_v3_standardized_021 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_id
dtype: string
splits:
- name: train
num_bytes: 40218541.218423784
num_examples: 74795
download_size: 8276625
dataset_size: 40218541.218423784
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_standardized_021"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ai4privacy/pii-masking-43k | ---
language:
- en
tags:
- legal
- business
- psychology
- privacy
size_categories:
- 10K<n<100K
---
# Purpose and Features
The purpose of the model and dataset is to remove personally identifiable information (PII) from text, especially in the context of AI assistants and LLMs.
The model is a fine-tuned version of DistilBERT, a smaller and faster version of BERT. It was adapted for the task of token classification based on the largest open-source PII masking dataset known to us, which we are releasing simultaneously. The model size is 62 million parameters. The original encoding of the parameters yields a model size of 268 MB, which is compressed to 43 MB after parameter quantization. The models are available in PyTorch, TensorFlow, and TensorFlow.js.
The dataset is composed of ~43’000 observations. Each row starts with a natural language sentence that includes placeholders for PII and could plausibly be written to an AI assistant. The placeholders are then filled in with mocked personal information and tokenized with the BERT tokenizer. We label the tokens that correspond to PII, serving as the ground truth to train our model.
The dataset covers a range of contexts in which PII can appear. The sentences span 54 sensitive data types (~111 token classes), targeting 125 discussion subjects / use cases split across business, psychology and legal fields, and 5 interactions styles (e.g. casual conversation vs formal document).
Key facts:
- Currently 5.6m tokens with 43k PII examples.
- Scaling to 100k examples
- Human-in-the-loop validated
- Synthetic data generated using proprietary algorithms
- Adapted from DistilBertForTokenClassification
- Framework PyTorch
- 8 bit quantization
# Performance evaluation
| Test Precision | Test Recall | Test Accuracy |
|:-:|:-:|:-:|
| 0.998636 | 0.998945 | 0.994621 |
Training/Test Set split:
- 4300 Testing Examples (10%)
- 38700 Train Examples
# Community Engagement:
Newsletter & updates: www.Ai4privacy.com
- Looking for ML engineers, developers, beta-testers, human-in-the-loop validators (all languages)
- Integrations with already existing open source solutions
# Roadmap and Future Development
- Multilingual
- Extended integrations
- Continuously increase the training set
- Further optimisation of the model to reduce size and increase generalisability
- Next released major update is planned for the 14th of July (subscribe to newsletter for updates)
# Use Cases and Applications
**Chatbots**: Incorporating a PII masking model into chatbot systems can ensure the privacy and security of user conversations by automatically redacting sensitive information such as names, addresses, phone numbers, and email addresses.
**Customer Support Systems**: When interacting with customers through support tickets or live chats, masking PII can help protect sensitive customer data, enabling support agents to handle inquiries without the risk of exposing personal information.
**Email Filtering**: Email providers can utilize a PII masking model to automatically detect and redact PII from incoming and outgoing emails, reducing the chances of accidental disclosure of sensitive information.
**Data Anonymization**: Organizations dealing with large datasets containing PII, such as medical or financial records, can leverage a PII masking model to anonymize the data before sharing it for research, analysis, or collaboration purposes.
**Social Media Platforms**: Integrating PII masking capabilities into social media platforms can help users protect their personal information from unauthorized access, ensuring a safer online environment.
**Content Moderation**: PII masking can assist content moderation systems in automatically detecting and blurring or redacting sensitive information in user-generated content, preventing the accidental sharing of personal details.
**Online Forms**: Web applications that collect user data through online forms, such as registration forms or surveys, can employ a PII masking model to anonymize or mask the collected information in real-time, enhancing privacy and data protection.
**Collaborative Document Editing**: Collaboration platforms and document editing tools can use a PII masking model to automatically mask or redact sensitive information when multiple users are working on shared documents.
**Research and Data Sharing**: Researchers and institutions can leverage a PII masking model to ensure privacy and confidentiality when sharing datasets for collaboration, analysis, or publication purposes, reducing the risk of data breaches or identity theft.
**Content Generation**: Content generation systems, such as article generators or language models, can benefit from PII masking to automatically mask or generate fictional PII when creating sample texts or examples, safeguarding the privacy of individuals.
(...and whatever else your creative mind can think of)
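As a concrete illustration of the masking step these use cases rely on, here is a minimal, self-contained sketch that redacts detected PII spans from a text. The `(start, end, type)` spans would normally come from the token-classification model; they are hard-coded here so the example runs on its own:

```python
# Sketch: redact PII character spans from text, replacing each with a [TYPE] tag.
# Spans are hard-coded stand-ins for the model's token-classification output.

def mask_pii(text, spans):
    """Replace each (start, end, pii_type) span with a [PII_TYPE] marker."""
    # Work right-to-left so earlier offsets stay valid after each replacement.
    for start, end, pii_type in sorted(spans, reverse=True):
        text = text[:start] + f"[{pii_type}]" + text[end:]
    return text


masked = mask_pii(
    "Contact Jane Doe at jane@example.com for details.",
    [(8, 16, "NAME"), (20, 36, "EMAIL")],
)
print(masked)  # Contact [NAME] at [EMAIL] for details.
```

Replacing right-to-left is the key design choice: substituting a span changes the length of the string, so processing spans in descending start order keeps the remaining offsets valid.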
# Support and Maintenance
AI4Privacy is a project affiliated with [AISuisse SA](https://www.aisuisse.com/). |
patruff/oai-style-chuckles2 | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 110995
num_examples: 605
- name: test
num_bytes: 27923
num_examples: 152
download_size: 27330
dataset_size: 138918
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
setiadi01/test-lawyer | ---
license: openrail
language:
- en
size_categories:
- n<1K
--- |
CyberHarem/hori_yuuko_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hori_yuuko/堀裕子/호리유코 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of hori_yuuko/堀裕子/호리유코 (THE iDOLM@STER: Cinderella Girls), containing 245 images and their tags.
The core tags of this character are `brown_hair, ponytail, red_eyes, bow, breasts, hair_bow, bangs, brown_eyes, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 245 | 242.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hori_yuuko_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 245 | 146.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hori_yuuko_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 542 | 300.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hori_yuuko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 245 | 215.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hori_yuuko_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 542 | 420.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hori_yuuko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hori_yuuko_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, blush, bare_shoulders, simple_background, solo, white_background, collarbone, hair_scrunchie, off_shoulder, upper_body, hair_between_eyes, high_ponytail, looking_at_viewer, long_sleeves, smile, sweater, necklace, sidelocks, holding_spoon, open_mouth, shirt, long_hair |
| 1 | 6 |  |  |  |  |  | 1girl, skirt, solo, necklace, open_mouth, :d, looking_at_viewer, spoon, thighhighs, blush, bracelet, scrunchie |
| 2 | 5 |  |  |  |  |  | 1girl, school_uniform, blush, hoodie, smile, solo, spoon, looking_at_viewer, open_mouth, plaid_skirt, school_bag |
| 3 | 5 |  |  |  |  |  | blush, plaid_skirt, pleated_skirt, school_uniform, 1girl, blue_skirt, bowtie, looking_at_viewer, red_bow, sidelocks, solo_focus, white_shirt, 1boy, :d, cowboy_shot, hair_between_eyes, hood_down, hooded_jacket, miniskirt, open_mouth, striped_bow, white_background, 2girls, high_ponytail, out_of_frame, pink_bow, simple_background, sweatdrop, v-shaped_eyebrows, yellow_hoodie |
| 4 | 10 |  |  |  |  |  | 1girl, navel, solo, belt, midriff, looking_at_viewer, smile, earrings, open_mouth, thigh_strap, cleavage, jacket, long_hair, short_shorts, black_gloves, blush, chain |
| 5 | 14 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, bar_censor, fellatio, nude, simple_background, hair_between_eyes, sweat, high_ponytail, nose_blush, white_background, cum, heart, nipples |
| 6 | 6 |  |  |  |  |  | 1girl, blush, floral_print, looking_at_viewer, solo, yukata, candy_apple, holding_food, outdoors, pink_bow, print_kimono, smile, fireworks, hair_ornament, obi, open_mouth, strawberry, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | bare_shoulders | simple_background | solo | white_background | collarbone | hair_scrunchie | off_shoulder | upper_body | hair_between_eyes | high_ponytail | looking_at_viewer | long_sleeves | smile | sweater | necklace | sidelocks | holding_spoon | open_mouth | shirt | long_hair | skirt | :d | spoon | thighhighs | bracelet | scrunchie | school_uniform | hoodie | plaid_skirt | school_bag | pleated_skirt | blue_skirt | bowtie | red_bow | solo_focus | white_shirt | 1boy | cowboy_shot | hood_down | hooded_jacket | miniskirt | striped_bow | 2girls | out_of_frame | pink_bow | sweatdrop | v-shaped_eyebrows | yellow_hoodie | navel | belt | midriff | earrings | thigh_strap | cleavage | jacket | short_shorts | black_gloves | chain | hetero | penis | bar_censor | fellatio | nude | sweat | nose_blush | cum | heart | nipples | floral_print | yukata | candy_apple | holding_food | outdoors | print_kimono | fireworks | hair_ornament | obi | strawberry |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------|:--------------------|:-------|:-------------------|:-------------|:-----------------|:---------------|:-------------|:--------------------|:----------------|:--------------------|:---------------|:--------|:----------|:-----------|:------------|:----------------|:-------------|:--------|:------------|:--------|:-----|:--------|:-------------|:-----------|:------------|:-----------------|:---------|:--------------|:-------------|:----------------|:-------------|:---------|:----------|:-------------|:--------------|:-------|:--------------|:------------|:----------------|:------------|:--------------|:---------|:---------------|:-----------|:------------|:--------------------|:----------------|:--------|:-------|:----------|:-----------|:--------------|:-----------|:---------|:---------------|:---------------|:--------|:---------|:--------|:-------------|:-----------|:-------|:--------|:-------------|:------|:--------|:----------|:---------------|:---------|:--------------|:---------------|:-----------|:---------------|:------------|:----------------|:------|:-------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | X | | | | | | | | X | | | | X | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | X | | | | | | | | X | | X | | | | | X | | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | | X | | | | | X | X | X | | | | | X | | X | | | | X | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | | | X | | | | | | | | X | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 5 | 14 |  |  |  |  |  | X | | | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | | | X | | | | | X | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
tyzhu/find_last_sent_train_100_eval_40 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 310819
num_examples: 240
- name: validation
num_bytes: 39780
num_examples: 40
download_size: 0
dataset_size: 350599
---
# Dataset Card for "find_last_sent_train_100_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_81_1713220928 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 158060
num_examples: 411
download_size: 85164
dataset_size: 158060
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-project-ml6team__cnn_dailymail_nl-7b67cb71-1286049228 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- ml6team/cnn_dailymail_nl
eval_info:
task: summarization
model: yhavinga/t5-v1.1-large-dutch-cnn-test
metrics: []
dataset_name: ml6team/cnn_dailymail_nl
dataset_config: default
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: yhavinga/t5-v1.1-large-dutch-cnn-test
* Dataset: ml6team/cnn_dailymail_nl
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@yhavinga](https://huggingface.co/yhavinga) for evaluating this model. |
weijie210/UFB_reference_rejected | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_prefs
num_bytes: 214848706
num_examples: 55762
- name: test_prefs
num_bytes: 7077887
num_examples: 1843
download_size: 113783797
dataset_size: 221926593
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
banghua/random_pre | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: prompt
dtype: string
- name: answers
list:
- name: answer
dtype: string
- name: model
dtype: string
- name: rank
dtype: float64
- name: turns
dtype: int64
- name: num_responses
dtype: int64
- name: source
sequence: string
splits:
- name: train
num_bytes: 1206940856
num_examples: 182968
download_size: 551450326
dataset_size: 1206940856
---
# Dataset Card for "random_pre"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JamesNetflix/clothing | ---
dataset_info:
features:
- name: split
dtype: string
- name: label
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 4862406.0
num_examples: 44
download_size: 4863831
dataset_size: 4862406.0
---
# Dataset Card for "clothing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anzorq/kbd_lat-835k_ru-3M | ---
license: unknown
---
kbd (Latin script): 835k lines from a scraped text pile
ru: 3M lines from Wikipedia (OPUS)
Rimyy/problemMath-Gemma3.5k | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 2710768
num_examples: 3500
download_size: 1273865
dataset_size: 2710768
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_rte_for_to | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 487948
num_examples: 1144
- name: train
num_bytes: 453999
num_examples: 1028
download_size: 610018
dataset_size: 941947
---
# Dataset Card for "MULTI_VALUE_rte_for_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WforGodot/Addition | ---
license: mit
---
|
nayohan/fms-bench-raw | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: index
dtype: int64
- name: dataID
dtype: string
- name: relationship
dtype: string
- name: time_interval
sequence: string
- name: summary
sequence: string
- name: first_session_dialogue
sequence: string
- name: first_session_speakers
sequence: string
- name: second_session_dialogue
sequence: string
- name: second_session_speakers
sequence: string
- name: third_session_dialogue
sequence: string
- name: third_session_speakers
sequence: string
- name: fourth_session_dialogue
sequence: string
- name: fourth_session_speakers
sequence: string
- name: fifth_session_dialogue
sequence: string
- name: fifth_session_speakers
sequence: string
- name: eval_indicator
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: test
num_bytes: 661372
num_examples: 80
download_size: 352262
dataset_size: 661372
---
# Dataset Card for "fms-bench"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-6c534f-38130145044 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: yuvraj/summarizer-cnndm
metrics: ['rouge', 'accuracy', 'bleu']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: yuvraj/summarizer-cnndm
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Sini](https://huggingface.co/Sini) for evaluating this model. |
ashhadahsan/amazon_theme | ---
dataset_info:
features:
- name: Transcript
dtype: string
- name: Review Theme
dtype: string
splits:
- name: train
num_bytes: 347105
num_examples: 943
download_size: 208574
dataset_size: 347105
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "amazon_theme"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Saxo/ko_summarization_linkbricks_single_dataset_with_prompt_text_huggingface | ---
license: apache-2.0
---
|
yzhuang/autotree_snnxor_n15_l1_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 236440000
num_examples: 10000
- name: validation
num_bytes: 236440000
num_examples: 10000
- name: test
num_bytes: 236440000
num_examples: 10000
download_size: 432260994
dataset_size: 709320000
---
# Dataset Card for "autotree_snnxor_n15_l1_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aneeth/job_training_20K_samples | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: user_prompt
dtype: string
- name: model_response
dtype: string
splits:
- name: train
num_bytes: 36443200
num_examples: 20000
- name: validation
num_bytes: 1836052
num_examples: 1000
download_size: 9692443
dataset_size: 38279252
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
giux78/functioncalling-ita | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 214403032
num_examples: 112960
download_size: 89302942
dataset_size: 214403032
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arieg/cluster09_large_150 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': 000190
'1': '000203'
'2': '000211'
'3': '000621'
'4': '003270'
'5': '003535'
'6': 003908
'7': 003909
'8': 005159
'9': '005170'
'10': 009887
'11': '012052'
'12': 012394
'13': 012508
'14': 016158
'15': 018877
'16': 019708
'17': '022474'
'18': 022481
'19': 029971
'20': 031391
'21': '036144'
'22': 038822
'23': 038823
'24': 039900
'25': 043962
'26': 045519
'27': '045520'
'28': 049848
'29': 051278
'30': 051291
'31': '051776'
'32': 052947
'33': 052950
'34': '053576'
'35': '054062'
'36': '054064'
'37': '056517'
'38': '065037'
'39': 066187
'40': '067764'
'41': '067765'
'42': 068838
'43': 068840
'44': 068843
'45': 068853
'46': 068854
'47': 068860
'48': 068862
'49': 069787
'50': '072562'
'51': '072565'
'52': '072570'
'53': '072605'
'54': '072607'
'55': '072612'
'56': '073174'
'57': '073572'
'58': '073573'
'59': '074372'
'60': '074546'
'61': '075435'
'62': 078213
'63': 079985
'64': 079986
'65': 080696
'66': 082505
'67': 082915
'68': 082920
'69': 084157
'70': 085436
'71': 085438
'72': 085692
'73': 085693
'74': 085951
'75': 087099
'76': 087189
'77': 091933
'78': 091958
'79': 092874
'80': 095722
'81': 096728
'82': 096729
'83': 096730
'84': '105712'
'85': '105715'
'86': '105716'
'87': '105718'
'88': '105914'
'89': '105915'
'90': '105918'
'91': '106343'
'92': '107583'
'93': '107591'
'94': '108863'
'95': '110384'
'96': '111182'
'97': '113343'
'98': '114879'
'99': '115263'
'100': '115267'
'101': '115268'
'102': '115774'
'103': '115775'
'104': '115817'
'105': '115944'
'106': '115948'
'107': '116175'
'108': '116451'
'109': '116704'
'110': '116874'
'111': '118670'
'112': '118672'
'113': '119828'
'114': '119831'
'115': '120771'
'116': '121656'
'117': '121658'
'118': '122362'
'119': '122622'
'120': '122623'
'121': '124177'
'122': '124424'
'123': '126362'
'124': '126405'
'125': '126607'
'126': '126676'
'127': '126746'
'128': '127265'
'129': '128880'
'130': '128882'
'131': '129439'
'132': '129675'
'133': '131900'
'134': '131904'
'135': '132045'
'136': '132310'
'137': '134791'
'138': '134793'
'139': '136134'
'140': '136324'
'141': '138061'
'142': '139520'
'143': '139522'
'144': '140873'
'145': '142516'
'146': '142529'
'147': '142530'
'148': '143218'
'149': '143941'
'150': '145554'
'151': '145556'
'152': '145777'
'153': '148190'
'154': '148215'
'155': '152568'
splits:
- name: train
num_bytes: 1330348523.4
num_examples: 23400
download_size: 1301547352
dataset_size: 1330348523.4
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/fukiyose_seiri_toarumajutsunoindex | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Fukiyose Seiri
This is the dataset of Fukiyose Seiri, containing 96 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 96 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 224 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 96 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 96 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 96 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 96 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 96 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 224 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 224 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 224 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Ialoris/test2 | ---
license: mit
---
|
Markmus/amazon-shoe-reviews | ---
dataset_info:
features:
- name: labels
dtype: int64
- name: text
dtype: string
splits:
- name: test
num_bytes: 1871962.8
num_examples: 10000
- name: train
num_bytes: 16847665.2
num_examples: 90000
download_size: 10939033
dataset_size: 18719628.0
---
# Dataset Card for "amazon-shoe-reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tverous/anli-amr-amrlib | ---
dataset_info:
features:
- name: uid
dtype: string
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: reason
dtype: string
- name: linearized_amr
dtype: string
splits:
- name: train
num_bytes: 60139915
num_examples: 100459
- name: dev
num_bytes: 853527
num_examples: 1200
- name: test
num_bytes: 847367
num_examples: 1200
download_size: 20999544
dataset_size: 61840809
---
# Dataset Card for "anli-amr-amrlib"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Xwin-LM__Xwin-Math-70B-V1.0 | ---
pretty_name: Evaluation run of Xwin-LM/Xwin-Math-70B-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xwin-LM/Xwin-Math-70B-V1.0](https://huggingface.co/Xwin-LM/Xwin-Math-70B-V1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xwin-LM__Xwin-Math-70B-V1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T23:58:40.748061](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-Math-70B-V1.0/blob/main/results_2024-02-09T23-58-40.748061.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6620534022780993,\n\
\ \"acc_stderr\": 0.03099024477372236,\n \"acc_norm\": 0.6648994655093221,\n\
\ \"acc_norm_stderr\": 0.031600005326803196,\n \"mc1\": 0.35006119951040393,\n\
\ \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5157978023012086,\n\
\ \"mc2_stderr\": 0.015040824023582368\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809174,\n\
\ \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094087\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6549492133041227,\n\
\ \"acc_stderr\": 0.004744132825391526,\n \"acc_norm\": 0.8488348934475204,\n\
\ \"acc_norm_stderr\": 0.0035747765941085046\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469543,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469543\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031093,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031093\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419871,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419871\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656177,\n\
\ \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656177\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334334,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334334\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223168,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223168\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633507,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633507\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080437,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \
\ \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7399103139013453,\n\
\ \"acc_stderr\": 0.029442495585857476,\n \"acc_norm\": 0.7399103139013453,\n\
\ \"acc_norm_stderr\": 0.029442495585857476\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573975,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573975\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.842911877394636,\n\
\ \"acc_stderr\": 0.013012459322650714,\n \"acc_norm\": 0.842911877394636,\n\
\ \"acc_norm_stderr\": 0.013012459322650714\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5128491620111731,\n\
\ \"acc_stderr\": 0.01671697883804354,\n \"acc_norm\": 0.5128491620111731,\n\
\ \"acc_norm_stderr\": 0.01671697883804354\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826517,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826517\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.02418515064781871,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.02418515064781871\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.02240967454730416,\n\
\ \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.02240967454730416\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5280312907431551,\n\
\ \"acc_stderr\": 0.012750151802922447,\n \"acc_norm\": 0.5280312907431551,\n\
\ \"acc_norm_stderr\": 0.012750151802922447\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7287581699346405,\n \"acc_stderr\": 0.017986615304030316,\n \
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.017986615304030316\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073142,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073142\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.02411267824090081,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.02411267824090081\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35006119951040393,\n\
\ \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5157978023012086,\n\
\ \"mc2_stderr\": 0.015040824023582368\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156886\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5799848369977255,\n \
\ \"acc_stderr\": 0.01359512168852048\n }\n}\n```"
repo_url: https://huggingface.co/Xwin-LM/Xwin-Math-70B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|arc:challenge|25_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|gsm8k|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hellaswag|10_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-58-40.748061.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T23-58-40.748061.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- '**/details_harness|winogrande|5_2024-02-09T23-58-40.748061.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T23-58-40.748061.parquet'
- config_name: results
data_files:
- split: 2024_02_09T23_58_40.748061
path:
- results_2024-02-09T23-58-40.748061.parquet
- split: latest
path:
- results_2024-02-09T23-58-40.748061.parquet
---
# Dataset Card for Evaluation run of Xwin-LM/Xwin-Math-70B-V1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-Math-70B-V1.0](https://huggingface.co/Xwin-LM/Xwin-Math-70B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-Math-70B-V1.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-09T23:58:40.748061](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-Math-70B-V1.0/blob/main/results_2024-02-09T23-58-40.748061.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6620534022780993,
"acc_stderr": 0.03099024477372236,
"acc_norm": 0.6648994655093221,
"acc_norm_stderr": 0.031600005326803196,
"mc1": 0.35006119951040393,
"mc1_stderr": 0.01669794942015103,
"mc2": 0.5157978023012086,
"mc2_stderr": 0.015040824023582368
},
"harness|arc:challenge|25": {
"acc": 0.5998293515358362,
"acc_stderr": 0.014317197787809174,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094087
},
"harness|hellaswag|10": {
"acc": 0.6549492133041227,
"acc_stderr": 0.004744132825391526,
"acc_norm": 0.8488348934475204,
"acc_norm_stderr": 0.0035747765941085046
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469543,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031093,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031093
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419871,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419871
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656177,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656177
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334334,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334334
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223168,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223168
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080437,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7399103139013453,
"acc_stderr": 0.029442495585857476,
"acc_norm": 0.7399103139013453,
"acc_norm_stderr": 0.029442495585857476
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573975,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573975
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.842911877394636,
"acc_stderr": 0.013012459322650714,
"acc_norm": 0.842911877394636,
"acc_norm_stderr": 0.013012459322650714
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5128491620111731,
"acc_stderr": 0.01671697883804354,
"acc_norm": 0.5128491620111731,
"acc_norm_stderr": 0.01671697883804354
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826517,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826517
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.02418515064781871,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.02418515064781871
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.02240967454730416,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.02240967454730416
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5280312907431551,
"acc_stderr": 0.012750151802922447,
"acc_norm": 0.5280312907431551,
"acc_norm_stderr": 0.012750151802922447
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.017986615304030316,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.017986615304030316
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073142,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073142
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090081,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090081
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35006119951040393,
"mc1_stderr": 0.01669794942015103,
"mc2": 0.5157978023012086,
"mc2_stderr": 0.015040824023582368
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156886
},
"harness|gsm8k|5": {
"acc": 0.5799848369977255,
"acc_stderr": 0.01359512168852048
}
}
```
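Each per-task entry in the JSON above shares the same shape (`acc`, `acc_stderr`, and usually `acc_norm`), so aggregate metrics such as an overall MMLU score can be recomputed by averaging the `hendrycksTest-*` entries. A minimal sketch (the `results` dict here is a small illustrative subset of the full JSON file, not the complete results; note that the leaderboard's own aggregation may differ from this simple unweighted mean):

```python
# Recompute an aggregate accuracy from a results dict shaped like the
# JSON above. Illustrative subset only; load the full file from the repo.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.756578947368421},
    "harness|winogrande|5": {"acc": 0.8153117600631413},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their accuracies.
mmlu = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_acc = sum(mmlu) / len(mmlu)
print(round(mmlu_acc, 4))  # → 0.5671
```

The same pattern works for `acc_norm` or for other task families by changing the key filter.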
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pruhtopia/prithvi-mangrove-dataset | ---
license: mit
---
|
open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-qLoRa | ---
pretty_name: Evaluation run of NLUHOPOE/experiment2-cause-qLoRa
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NLUHOPOE/experiment2-cause-qLoRa](https://huggingface.co/NLUHOPOE/experiment2-cause-qLoRa)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-qLoRa\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T01:27:58.809266](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-qLoRa/blob/main/results_2024-03-02T01-27-58.809266.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6187092967805649,\n\
\ \"acc_stderr\": 0.03277112039995135,\n \"acc_norm\": 0.6247066510159702,\n\
\ \"acc_norm_stderr\": 0.03344268306035278,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.4713263218122602,\n\
\ \"mc2_stderr\": 0.01458246045981096\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.014521226405627077,\n\
\ \"acc_norm\": 0.6040955631399317,\n \"acc_norm_stderr\": 0.014291228393536588\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6210914160525791,\n\
\ \"acc_stderr\": 0.004841238763529372,\n \"acc_norm\": 0.8276239792869946,\n\
\ \"acc_norm_stderr\": 0.003769350079195885\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\
\ \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.7258064516129032,\n\
\ \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915333,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915333\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110932,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415925,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415925\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.0246624968452098,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.0246624968452098\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.014214138556913917,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.014214138556913917\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n\
\ \"acc_stderr\": 0.016185444179457175,\n \"acc_norm\": 0.3743016759776536,\n\
\ \"acc_norm_stderr\": 0.016185444179457175\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630457,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567654,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567654\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.016419874731135025,\n \"mc2\": 0.4713263218122602,\n\
\ \"mc2_stderr\": 0.01458246045981096\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223188\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3502653525398029,\n \
\ \"acc_stderr\": 0.013140409455571269\n }\n}\n```"
repo_url: https://huggingface.co/NLUHOPOE/experiment2-cause-qLoRa
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|arc:challenge|25_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|gsm8k|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hellaswag|10_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-27-58.809266.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T01-27-58.809266.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- '**/details_harness|winogrande|5_2024-03-02T01-27-58.809266.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T01-27-58.809266.parquet'
- config_name: results
data_files:
- split: 2024_03_02T01_27_58.809266
path:
- results_2024-03-02T01-27-58.809266.parquet
- split: latest
path:
- results_2024-03-02T01-27-58.809266.parquet
---
# Dataset Card for Evaluation run of NLUHOPOE/experiment2-cause-qLoRa
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLUHOPOE/experiment2-cause-qLoRa](https://huggingface.co/NLUHOPOE/experiment2-cause-qLoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-qLoRa",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-02T01:27:58.809266](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__experiment2-cause-qLoRa/blob/main/results_2024-03-02T01-27-58.809266.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6187092967805649,
"acc_stderr": 0.03277112039995135,
"acc_norm": 0.6247066510159702,
"acc_norm_stderr": 0.03344268306035278,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.4713263218122602,
"mc2_stderr": 0.01458246045981096
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.014521226405627077,
"acc_norm": 0.6040955631399317,
"acc_norm_stderr": 0.014291228393536588
},
"harness|hellaswag|10": {
"acc": 0.6210914160525791,
"acc_stderr": 0.004841238763529372,
"acc_norm": 0.8276239792869946,
"acc_norm_stderr": 0.003769350079195885
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915333,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915333
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415925,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415925
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.0246624968452098,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.0246624968452098
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913917,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913917
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.016185444179457175,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.016185444179457175
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630457,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567654,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567654
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135025,
"mc2": 0.4713263218122602,
"mc2_stderr": 0.01458246045981096
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223188
},
"harness|gsm8k|5": {
"acc": 0.3502653525398029,
"acc_stderr": 0.013140409455571269
}
}
```
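As a rough illustration, a results payload in this shape can be summarized with a few lines of Python. This is a hedged sketch, not part of the leaderboard tooling: the `summarize_accuracies` and `mean_accuracy` helpers and the trimmed `sample` payload are hypothetical, assuming only the `harness|...` keys and `acc` fields shown above.

```python
# Sketch: summarize per-task accuracies from a results payload shaped like the
# JSON above. Hypothetical helpers, not an official leaderboard utility.

def summarize_accuracies(results: dict) -> dict:
    """Return {task_name: acc} for every harness task that reports an "acc"."""
    return {
        task: metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|") and "acc" in metrics
    }

def mean_accuracy(results: dict) -> float:
    """Unweighted mean accuracy over all harness tasks reporting "acc"."""
    accs = summarize_accuracies(results)
    return sum(accs.values()) / len(accs)

# Trimmed, illustrative payload in the same shape as the full results above:
sample = {
    "all": {"acc": 0.62},  # aggregate entry, skipped (no "harness|" prefix)
    "harness|arc:challenge|25": {"acc": 0.5554, "acc_stderr": 0.0145},
    "harness|hellaswag|10": {"acc": 0.6210, "acc_stderr": 0.0048},
    "harness|truthfulqa:mc|0": {"mc1": 0.3268},  # skipped (no "acc" field)
}
print(mean_accuracy(sample))
```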
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
malucoelhaofc/ScottTenorman201V2 | ---
license: openrail
---
|
serbog/job_listing_german_cleaned_bert | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: cleaned_description
dtype: string
- name: title
dtype: string
- name: C1
dtype: int64
- name: C2
dtype: int64
- name: C3
dtype: int64
- name: C4
dtype: int64
- name: C5
dtype: int64
- name: C6
dtype: int64
- name: C7
dtype: int64
- name: C8
dtype: int64
- name: C9
dtype: int64
splits:
- name: train
num_bytes: 1119693900
num_examples: 509834
- name: eval
num_bytes: 261886055
num_examples: 104864
- name: test
num_bytes: 234468000
num_examples: 102796
download_size: 670254315
dataset_size: 1616047955
---
# Dataset Card for "job_listing_german_cleaned_bert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
matlok/python-audio-copilot-training-using-import-knowledge-graphs | ---
license:
- other
pretty_name: >-
python copilot audio training using imports with knowledge graphs
dataset_info:
- config_name: view_schema
splits:
- name: view_schema
configs:
- config_name: view_schema
data_files:
- split: view_schema
path: files/lok-python-copilot-audio.import-v1_00000274.parquet
size_categories:
- 10K<n<100K
tags:
- python-copilot
- python-coding
- python-architecture
- knowledge-graphs
- multimodal
- text-image-audio
- fine-tuning
- training
- question-answering
- image-knowledge-graph
- alpaca
- mp3
- png
- text
- instruct
- imports
# supported task_categories
# text-classification, token-classification, table-question-answering, question-answering, zero-shot-classification, translation, summarization, conversational, feature-extraction, text-generation, text2text-generation, fill-mask, sentence-similarity, text-to-speech, text-to-audio, automatic-speech-recognition, audio-to-audio, audio-classification, voice-activity-detection, depth-estimation, image-classification, object-detection, image-segmentation, text-to-image, image-to-text, image-to-image, image-to-video, unconditional-image-generation, video-classification, reinforcement-learning, robotics, tabular-classification, tabular-regression, tabular-to-text, table-to-text, multiple-choice, text-retrieval, time-series-forecasting, text-to-video, visual-question-answering, document-question-answering, zero-shot-image-classification, graph-ml, mask-generation, zero-shot-object-detection, text-to-3d, image-to-3d, other
task_categories:
- text-to-audio
- audio-to-audio
- question-answering
# supported task_ids
# acceptability-classification, entity-linking-classification, fact-checking, intent-classification, language-identification, multi-class-classification, multi-label-classification, multi-input-text-classification, natural-language-inference, semantic-similarity-classification, sentiment-classification, topic-classification, semantic-similarity-scoring, sentiment-scoring, sentiment-analysis, hate-speech-detection, text-scoring, named-entity-recognition, part-of-speech, parsing, lemmatization, word-sense-disambiguation, coreference-resolution, extractive-qa, open-domain-qa, closed-domain-qa, news-articles-summarization, news-articles-headline-generation, dialogue-generation, dialogue-modeling, language-modeling, text-simplification, explanation-generation, abstractive-qa, open-domain-abstractive-qa, closed-domain-qa, open-book-qa, closed-book-qa, slot-filling, masked-language-modeling, keyword-spotting, speaker-identification, audio-intent-classification, audio-emotion-recognition, audio-language-identification, multi-label-image-classification, multi-class-image-classification, face-detection, vehicle-detection, instance-segmentation, semantic-segmentation, panoptic-segmentation, image-captioning, image-inpainting, image-colorization, super-resolution, grasping, task-planning, tabular-multi-class-classification, tabular-multi-label-classification, tabular-single-column-regression, rdf-to-text, multiple-choice-qa, multiple-choice-coreference-resolution, document-retrieval, utterance-retrieval, entity-linking-retrieval, fact-checking-retrieval, univariate-time-series-forecasting, multivariate-time-series-forecasting, visual-question-answering, document-question-answering
task_ids:
- parsing
---
## Python Copilot Audio Training using Imports with Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each imported module for each unique class in each module file has a question-and-answer mp3 pair, where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet **dbytes** column, keyed by the associated source code **file_path** identifier.
- Rows: 52086
- Size: 17.3 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
```
{
"audio_path": "string",
"audio_type": "string",
"dbytes": "binary",
"dbytes_len": "int64",
"file_path": "string",
"file_path_len": "int64",
"lang": "string",
"lang_len": "int64",
"recsize": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-audio-copilot-training-using-imports-knowledge-graphs", data_dir="files")
```
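Since each row carries the raw mp3 bytes in `dbytes` alongside its `audio_path` (per the schema above), a row can be written back out as a playable file. The helper below is an illustrative sketch, not part of this dataset's tooling; only the `dbytes` and `audio_path` field names are taken from the schema, and the example row is fake.

```python
import os

def write_audio_row(row: dict, out_dir: str) -> str:
    """Write one row's mp3 bytes to disk, reusing its audio_path filename.

    Assumes the schema above: "dbytes" holds the raw mp3 bytes and
    "audio_path" identifies the source audio file.
    """
    os.makedirs(out_dir, exist_ok=True)
    out_path = os.path.join(out_dir, os.path.basename(row["audio_path"]))
    with open(out_path, "wb") as f:
        f.write(row["dbytes"])
    return out_path

# Hypothetical row with fake bytes, just to show the call shape:
row = {"audio_path": "audio/example.mp3", "dbytes": b"\xff\xfbfake-mp3-bytes"}
print(write_audio_row(row, "decoded_audio"))
```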
|
CyberHarem/y_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of y (Pokémon)
This is the dataset of y (Pokémon), containing 15 images and their tags.
The core tags of this character are `blonde_hair, short_hair, bangs, blue_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:-----------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 6.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 5.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 8.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 6.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 9.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/y_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/y_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, sleeveless_shirt, looking_at_viewer, red_skirt, smile, solo, black_shirt, pleated_skirt, white_background, black_thighhighs, open_mouth, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | sleeveless_shirt | looking_at_viewer | red_skirt | smile | solo | black_shirt | pleated_skirt | white_background | black_thighhighs | open_mouth | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------------------|:------------|:--------|:-------|:--------------|:----------------|:-------------------|:-------------------|:-------------|:--------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
|
DangDangg/LinhLanCh | ---
license: openrail
---
|
open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1 | ---
pretty_name: Evaluation run of SkunkworksAI/Mistralic-7B-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SkunkworksAI/Mistralic-7B-1](https://huggingface.co/SkunkworksAI/Mistralic-7B-1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T13:22:20.115560](https://huggingface.co/datasets/open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1/blob/main/results_2023-10-28T13-22-20.115560.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3366191275167785,\n\
\ \"em_stderr\": 0.004839388843031059,\n \"f1\": 0.43708682885906275,\n\
\ \"f1_stderr\": 0.004627060310059935,\n \"acc\": 0.44050675782818416,\n\
\ \"acc_stderr\": 0.010231909076615354\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3366191275167785,\n \"em_stderr\": 0.004839388843031059,\n\
\ \"f1\": 0.43708682885906275,\n \"f1_stderr\": 0.004627060310059935\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1106899166034875,\n \
\ \"acc_stderr\": 0.008642172551392479\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838227\n\
\ }\n}\n```"
repo_url: https://huggingface.co/SkunkworksAI/Mistralic-7B-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|arc:challenge|25_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T13_22_20.115560
path:
- '**/details_harness|drop|3_2023-10-28T13-22-20.115560.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T13-22-20.115560.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T13_22_20.115560
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-22-20.115560.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-22-20.115560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hellaswag|10_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T13_22_20.115560
path:
- '**/details_harness|winogrande|5_2023-10-28T13-22-20.115560.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T13-22-20.115560.parquet'
- config_name: results
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- results_2023-10-11T09-21-21.065888.parquet
- split: 2023_10_28T13_22_20.115560
path:
- results_2023-10-28T13-22-20.115560.parquet
- split: latest
path:
- results_2023-10-28T13-22-20.115560.parquet
---
# Dataset Card for Evaluation run of SkunkworksAI/Mistralic-7B-1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/SkunkworksAI/Mistralic-7B-1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [SkunkworksAI/Mistralic-7B-1](https://huggingface.co/SkunkworksAI/Mistralic-7B-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T13:22:20.115560](https://huggingface.co/datasets/open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1/blob/main/results_2023-10-28T13-22-20.115560.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3366191275167785,
"em_stderr": 0.004839388843031059,
"f1": 0.43708682885906275,
"f1_stderr": 0.004627060310059935,
"acc": 0.44050675782818416,
"acc_stderr": 0.010231909076615354
},
"harness|drop|3": {
"em": 0.3366191275167785,
"em_stderr": 0.004839388843031059,
"f1": 0.43708682885906275,
"f1_stderr": 0.004627060310059935
},
"harness|gsm8k|5": {
"acc": 0.1106899166034875,
"acc_stderr": 0.008642172551392479
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838227
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
JaMyron/JayEvansV1 | ---
license: openrail
---
|
ideepankarsharma2003/ImageClassificationStableDiffusion_small | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ai_gen
'1': human
splits:
- name: train
num_bytes: 21159539615.0
num_examples: 36000
- name: validation
num_bytes: 625130215.944
num_examples: 1514
- name: test
num_bytes: 1073534175.0
num_examples: 2000
download_size: 22249314646
dataset_size: 22858204005.944
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
SCIR-HI/PseudoMD-1M | ---
license: apache-2.0
task_categories:
- translation
- text2text-generation
language:
- en
tags:
- chemistry
- biology
- medical
size_categories:
- 1M<n<10M
---
Pre-training dataset used in paper "[From Artificially Real to Real: Leveraging Pseudo Data from Large Language Models for Low-Resource Molecule Discovery](https://arxiv.org/abs/2309.05203)" (AAAI 2024)
The PseudoMD-1M dataset is the first artificially-real dataset for cross-modal molecule discovery, consisting of 1,020,139 pseudo molecule-description pairs. Every molecule is represented using its canonical SMILES notation, sourced from PubChem via the PUG View API. On average, each description within PseudoMD-1M contains 5.11 sentences, 106.47 words, and 165.07 tokens.
### Citation
If you found the dataset useful, please cite:
```bibtex
@article{chen2023artificially,
title={From Artificially Real to Real: Leveraging Pseudo Data from Large Language Models for Low-Resource Molecule Discovery},
author={Chen, Yuhan and Xi, Nuwa and Du, Yanrui and Wang, Haochun and Jianyu, Chen and Zhao, Sendong and Qin, Bing},
journal={arXiv preprint arXiv:2309.05203},
year={2023}
}
``` |
liuyanchen1015/MULTI_VALUE_cola_double_obj_order | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 20743
num_examples: 276
- name: test
num_bytes: 20704
num_examples: 284
- name: train
num_bytes: 156480
num_examples: 2169
download_size: 100955
dataset_size: 197927
---
# Dataset Card for "MULTI_VALUE_cola_double_obj_order"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Carlosgg14/leorio | ---
license: openrail
---
|
JetBrains-Research/lca-bug-localization | ---
language:
- en
license: other
task_categories:
- text-generation
pretty_name: LCA (Bug Localization)
tags:
- code
dataset_info:
- config_name: java
features:
- name: id
dtype: int64
- name: text_id
dtype: string
- name: repo_owner
dtype: string
- name: repo_name
dtype: string
- name: issue_url
dtype: string
- name: pull_url
dtype: string
- name: comment_url
dtype: string
- name: links_count
dtype: int64
- name: issue_title
dtype: string
- name: link_keyword
dtype: string
- name: issue_body
dtype: string
- name: base_sha
dtype: string
- name: head_sha
dtype: string
- name: diff_url
dtype: string
- name: diff
dtype: string
- name: changed_files
dtype: string
- name: changed_files_count
dtype: int64
- name: java_changed_files_count
dtype: int64
- name: py_changed_files_count
dtype: int64
- name: kt_changed_files_count
dtype: int64
- name: code_changed_files_count
dtype: int64
- name: changed_files_exts
dtype: string
- name: pull_create_at
dtype: timestamp[s]
- name: stars
dtype: int64
- name: language
dtype: string
- name: languages
dtype: string
- name: license
dtype: string
splits:
- name: dev
num_bytes: 28775486
num_examples: 2703
download_size: 8312510
dataset_size: 28775486
- config_name: kt
features:
- name: id
dtype: int64
- name: text_id
dtype: string
- name: repo_owner
dtype: string
- name: repo_name
dtype: string
- name: issue_url
dtype: string
- name: pull_url
dtype: string
- name: comment_url
dtype: string
- name: links_count
dtype: int64
- name: issue_title
dtype: string
- name: link_keyword
dtype: string
- name: issue_body
dtype: string
- name: base_sha
dtype: string
- name: head_sha
dtype: string
- name: diff_url
dtype: string
- name: diff
dtype: string
- name: changed_files
dtype: string
- name: changed_files_count
dtype: int64
- name: java_changed_files_count
dtype: int64
- name: py_changed_files_count
dtype: int64
- name: kt_changed_files_count
dtype: int64
- name: code_changed_files_count
dtype: int64
- name: changed_files_exts
dtype: string
- name: pull_create_at
dtype: timestamp[s]
- name: stars
dtype: int64
- name: language
dtype: string
- name: languages
dtype: string
- name: license
dtype: string
splits:
- name: dev
num_bytes: 5417683
num_examples: 645
download_size: 1707311
dataset_size: 5417683
- config_name: mixed
features:
- name: id
dtype: int64
- name: text_id
dtype: string
- name: repo_owner
dtype: string
- name: repo_name
dtype: string
- name: issue_url
dtype: string
- name: pull_url
dtype: string
- name: comment_url
dtype: string
- name: links_count
dtype: int64
- name: issue_title
dtype: string
- name: link_keyword
dtype: string
- name: issue_body
dtype: string
- name: base_sha
dtype: string
- name: head_sha
dtype: string
- name: diff_url
dtype: string
- name: diff
dtype: string
- name: changed_files
dtype: string
- name: changed_files_count
dtype: int64
- name: java_changed_files_count
dtype: int64
- name: py_changed_files_count
dtype: int64
- name: kt_changed_files_count
dtype: int64
- name: code_changed_files_count
dtype: int64
- name: changed_files_exts
dtype: string
- name: pull_create_at
dtype: timestamp[s]
- name: stars
dtype: int64
- name: language
dtype: string
- name: languages
dtype: string
- name: license
dtype: string
splits:
- name: dev
num_bytes: 95282639
num_examples: 2686
download_size: 30911114
dataset_size: 95282639
- config_name: py
features:
- name: id
dtype: int64
- name: text_id
dtype: string
- name: repo_owner
dtype: string
- name: repo_name
dtype: string
- name: issue_url
dtype: string
- name: pull_url
dtype: string
- name: comment_url
dtype: string
- name: links_count
dtype: int64
- name: issue_title
dtype: string
- name: link_keyword
dtype: string
- name: issue_body
dtype: string
- name: base_sha
dtype: string
- name: head_sha
dtype: string
- name: diff_url
dtype: string
- name: diff
dtype: string
- name: changed_files
dtype: string
- name: changed_files_count
dtype: int64
- name: java_changed_files_count
dtype: int64
- name: py_changed_files_count
dtype: int64
- name: kt_changed_files_count
dtype: int64
- name: code_changed_files_count
dtype: int64
- name: changed_files_exts
dtype: string
- name: pull_create_at
dtype: timestamp[s]
- name: stars
dtype: int64
- name: language
dtype: string
- name: languages
dtype: string
- name: license
dtype: string
splits:
- name: dev
num_bytes: 30149649
num_examples: 4568
download_size: 10930678
dataset_size: 30149649
configs:
- config_name: java
data_files:
- split: dev
path: java/dev-*
- config_name: kt
data_files:
- split: dev
path: kt/dev-*
- config_name: mixed
data_files:
- split: dev
path: mixed/dev-*
- config_name: py
data_files:
- split: dev
path: py/dev-*
---
# LCA (Bug Localization)
This is the data for the **Bug Localization** benchmark, part of LCA.
## How-to
1. Since the dataset is private, if you haven't used HF Hub before, add your token via `huggingface-cli` first:
```
huggingface-cli login
```
2. List all the available configs via [`datasets.get_dataset_config_names`](https://huggingface.co/docs/datasets/v2.14.3/en/package_reference/loading_methods#datasets.get_dataset_config_names) and choose an appropriate one
3. Load the data via [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.3/en/package_reference/loading_methods#datasets.load_dataset):
```py
from datasets import load_dataset
# Select a configuration from ["py", "java", "kt", "mixed"]
configuration = "py"
# Select a split from ["dev", "train", "test"]
split = "dev"
# Load data
dataset = load_dataset("JetBrains-Research/lca-bug-localization", configuration, split=split)
```
4. Load repos via [`hf_hub_download`](https://huggingface.co/docs/huggingface_hub/v0.20.3/en/package_reference/file_download#huggingface_hub.hf_hub_download)
```py
import os
import subprocess

from huggingface_hub import hf_hub_download
from datasets import load_dataset
# Load json with list of repos' .tar.gz file paths
paths_json = load_dataset("JetBrains-Research/lca-bug-localization", data_files="paths.json")
# Load each repo in .tar.gz format, unzip, delete archive
repos = paths_json["repos"][0]
for repo_tar_path in repos:
local_repo_tars = hf_hub_download(
"JetBrains-Research/lca-bug-localization",
filename=repo_tar_path,
repo_type="dataset",
local_dir="local/dir"
)
result = subprocess.run(["tar", "-xzf", local_repo_tars, "-C", os.path.join("local/dir", "repos")])
os.remove(local_repo_tars)
```
## Dataset Structure
TODO: some overall structure or repo
### Bug localization data
This section concerns configurations with *full data* about each commit (no `-labels` suffix).
Each example has the following fields:
| **Field** | **Description** |
|:------------------:|:----------------------------------------:|
| `repo_owner` | Bug issue repository owner. |
| `repo_name` | Bug issue repository name. |
| `issue_url` | GitHub link to issue <br> `https://github.com/{repo_owner}/{repo_name}/issues/{issue_id}`. |
| `pull_url` | GitHub link to pull request <br> `https://github.com/{repo_owner}/{repo_name}/pull/{pull_id}`. |
| `comment_url` | GitHub link to comment with pull request to issue reference <br> `https://github.com/{repo_owner}/{repo_name}/pull/{pull_id}#issuecomment-{comment_id}`. |
| `issue_title` | Issue title. |
| `issue_body` | Issue body. |
| `base_sha` | Pull request base sha. |
| `head_sha` | Pull request head sha. |
| `diff_url` | Pull request diff url between base and head sha <br> `https://github.com/{repo_owner}/{repo_name}/compare/{base_sha}...{head_sha}`. |
| `diff` | Pull request diff content. |
| `changed_files` | List of changed files parsed from diff. |
| `changed_files_exts` | Dict mapping changed file extensions to their counts. |
| `changed_files_count` | Number of changed files. |
| `java_changed_files_count` | Number of changed `.java` files. |
| `kt_changed_files_count` | Number of changed `.kt` files. |
| `py_changed_files_count` | Number of changed `.py` files. |
| `code_changed_files_count` | Number of changed `.java`, `.kt` or `.py` files. |
| `pull_create_at` | Date of pull request creation in the format yyyy-mm-ddThh:mm:ssZ. |
| `stars` | Number of repo stars. |
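As a sketch of how these fields might be used downstream, one could filter for bug reports whose fix touched exactly one Python file. The records below are toy placeholders that only mimic the schema above, not real dataset rows:

```python
# Toy records that mimic a few of the fields documented above (placeholder values).
examples = [
    {"repo_name": "repo-a", "changed_files_count": 1, "py_changed_files_count": 1},
    {"repo_name": "repo-b", "changed_files_count": 3, "py_changed_files_count": 0},
    {"repo_name": "repo-c", "changed_files_count": 2, "py_changed_files_count": 2},
]

# Keep bug reports whose fix touched exactly one file, and that file is Python.
single_file_py = [
    ex for ex in examples
    if ex["changed_files_count"] == 1 and ex["py_changed_files_count"] == 1
]

print([ex["repo_name"] for ex in single_file_py])  # → ['repo-a']
```

With the real dataset loaded via `load_dataset`, the same predicate can be passed to `Dataset.filter`.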
### Repos data
TODO: describe repos data as `.tar.gz` archives with list of repos metadata |
freshpearYoon/train_free_17 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604550104
num_examples: 10000
download_size: 1278390834
dataset_size: 9604550104
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JMYasir/trReviews-ds-mini | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 1252876.2642514652
num_examples: 3378
- name: validation
num_bytes: 139455.7357485349
num_examples: 376
download_size: 0
dataset_size: 1392332.0
---
# Dataset Card for "trReviews-ds-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_80 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24182773344.75
num_examples: 251778
download_size: 21979407663
dataset_size: 24182773344.75
---
# Dataset Card for "chunk_80"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_125 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1300756492.0
num_examples: 255451
download_size: 1327471348
dataset_size: 1300756492.0
---
# Dataset Card for "chunk_125"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
drja23/geosignal-size8000 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3125515
num_examples: 1000
download_size: 1794078
dataset_size: 3125515
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ParZiVal04/Purr-Data_example_source_codes | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- code
size_categories:
- n<1K
---
Purr-Data Patch Source Code Dataset:
- This dataset is designed for training language models to generate source code for Purr-Data patches. It focuses specifically on patches that output a particular message when a "bang" object is clicked.
Dataset Creation:
- The dataset was created with the goal of evaluating the ability of large language models like Google's 2B GEMMA to be fine-tuned for Purr-Data source code generation.
Dataset Characteristics:
- Content: Each data point consists of two parts:
- Instruction: A textual description of the desired Purr-Data patch functionality. This description focuses on the message the patch should output.
example instruction => "can you make a Purr-Data patch that displays a funny message?"
- Response: The corresponding Purr-Data source code that fulfills the given instruction:
example response =>
#N canvas 761 0 768 809 10;
#X obj 260 170 bng 15 250 50 0 empty empty empty 17 7 0 10 #fcfcfc #000000 #000000;
#X msg 334 25 What do you call a fish with no eyes? Fsh!;
#X obj 427 335 print;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
- Focus: The dataset is restricted to examples where the patch functionality centers around printing a specific message on a bang click.
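As an illustration of the patch syntax shown above, such a source string can be assembled programmatically. The helper below is a hypothetical sketch, not part of the dataset: object coordinates are arbitrary, and messages containing Pd-special characters such as `;` or `,` would need escaping.

```python
def make_bang_print_patch(message: str) -> str:
    """Build a minimal Purr-Data patch that prints `message` when the bang is clicked."""
    lines = [
        "#N canvas 761 0 768 809 10;",          # canvas declaration
        "#X obj 260 170 bng 15 250 50 0 empty empty empty"
        " 17 7 0 10 #fcfcfc #000000 #000000;",  # bang object (index 0)
        f"#X msg 334 25 {message};",            # message box (index 1)
        "#X obj 427 335 print;",                # print object (index 2)
        "#X connect 0 0 1 0;",                  # bang outlet -> message inlet
        "#X connect 1 0 2 0;",                  # message outlet -> print inlet
    ]
    return "\n".join(lines)

print(make_bang_print_patch("Hello from Purr-Data!"))
```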
Potential Uses:
This dataset can be used for various purposes, including:
- Fine-tuning large language models like GEMMA for Purr-Data source code generation.
- Research on text-to-code techniques for visual programming languages.
- Development of code generation tools for Purr-Data.
Note: This dataset provides a starting point for training, and can be further expanded to include more complex Purr-Data functionalities beyond basic message printing. |
Nexdata/43411_Images_464_Categories_of_Trademarks_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
43,411 Images-464 Categories of Trademarks Data. The collection environments include indoor and outdoor scenes. In this dataset, each image is clear, free of watermarks, and contains at least one trademark. The dataset can be used for scene recognition and trademark classification.
For more details, please refer to the link: https://www.nexdata.ai/dataset/175?source=Huggingface
## Data size
43,411 images, 464 types of trademarks
## Collecting environment
including indoor and outdoor scenes
## Data diversity
multiple types of trademarks, multiple scenes
## Data format
the image data formats are .jpg, .png and .jpeg; the annotation file format is .json
## Annotation content
rectangular bounding boxes of trademarks
## Accuracy
the accuracy of rectangular bounding boxes is not less than 95%
# Licensing Information
Commercial License
|
KaiLv/UDR_SST-2 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 853094
num_examples: 6911
- name: test
num_bytes: 224519
num_examples: 1821
- name: debug
num_bytes: 617046
num_examples: 5000
download_size: 1109867
dataset_size: 1694659
---
# Dataset Card for "UDR_SST-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tonywu71/PokemonCards_fixed | ---
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: image_url
dtype: string
- name: caption
dtype: string
- name: name
dtype: string
- name: hp
dtype: int64
- name: set_name
dtype: string
splits:
- name: train
num_bytes: 9474973.87624629
num_examples: 13088
download_size: 3028812
dataset_size: 9474973.87624629
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
language:
- en
---
Fix for [TheFusion21/PokemonCards](https://huggingface.co/datasets/TheFusion21/PokemonCards), where the images with broken links were discarded. Tested while fine-tuning [HuggingFaceM4/idefics-9b](https://huggingface.co/HuggingFaceM4/idefics-9b) with LoRA using my custom Git repository: https://github.com/tonywu71/idefics-project. |
autoevaluate/autoeval-eval-futin__guess-vi-4444ed-2051267099 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: facebook/opt-66b
metrics: []
dataset_name: futin/guess
dataset_config: vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-66b
* Dataset: futin/guess
* Config: vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
CyberHarem/moroboshi_kirari_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of moroboshi_kirari/諸星きらり/모로보시키라리 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of moroboshi_kirari/諸星きらり/모로보시키라리 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `brown_hair, long_hair, brown_eyes, hair_ornament, star_hair_ornament, breasts, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 489.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moroboshi_kirari_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 332.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moroboshi_kirari_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 967 | 612.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moroboshi_kirari_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 448.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moroboshi_kirari_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 967 | 799.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moroboshi_kirari_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/moroboshi_kirari_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, :3, looking_at_viewer, navel, smile, solo, cleavage, large_breasts, simple_background, blush, star_(symbol), white_background, \m/, underboob, white_bikini |
| 1 | 8 |  |  |  |  |  | 1girl, dress, smile, solo, star_(symbol), :3, necklace, \m/, cleavage, looking_at_viewer, large_breasts, open_mouth, polka_dot, simple_background, white_background, blush, jacket |
| 2 | 5 |  |  |  |  |  | 1girl, :3, :d, dress, open_mouth, solo, star_(symbol), medium_breasts, necklace, bracelet, \m/, blush |
| 3 | 11 |  |  |  |  |  | 1girl, bracelet, dress, open_mouth, solo, star_(symbol), twintails, :3, food, \m/, hair_bow, necklace, gloves, ribbon, :d |
| 4 | 5 |  |  |  |  |  | 1girl, :3, bangs, blush, hair_bow, puffy_short_sleeves, smile, solo, two_side_up, wavy_hair, balloon, looking_at_viewer, open_mouth, striped, white_gloves, bracelet, detached_sleeves, ribbon, simple_background, star_(symbol), white_background, asymmetrical_legwear, candy_hair_ornament, earrings, frilled_dress, heart_hair_ornament, holding, mini_hat, orange_hair, outstretched_arms, polka_dot, stuffed_animal, thighhighs, top_hat, very_long_hair |
| 5 | 11 |  |  |  |  |  | looking_at_viewer, star_(symbol), :3, blush, open_mouth, skirt, 1girl, black_gloves, ghost, hair_bow, solo, witch_hat, bangs, frills, halloween, puffy_short_sleeves, dress, one_eye_closed, :d, striped, ;d, candy_hair_ornament |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | :3 | looking_at_viewer | navel | smile | solo | cleavage | large_breasts | simple_background | blush | star_(symbol) | white_background | \m/ | underboob | white_bikini | dress | necklace | open_mouth | polka_dot | jacket | :d | medium_breasts | bracelet | twintails | food | hair_bow | gloves | ribbon | bangs | puffy_short_sleeves | two_side_up | wavy_hair | balloon | striped | white_gloves | detached_sleeves | asymmetrical_legwear | candy_hair_ornament | earrings | frilled_dress | heart_hair_ornament | holding | mini_hat | orange_hair | outstretched_arms | stuffed_animal | thighhighs | top_hat | very_long_hair | skirt | black_gloves | ghost | witch_hat | frills | halloween | one_eye_closed | ;d |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----|:--------------------|:--------|:--------|:-------|:-----------|:----------------|:--------------------|:--------|:----------------|:-------------------|:------|:------------|:---------------|:--------|:-----------|:-------------|:------------|:---------|:-----|:-----------------|:-----------|:------------|:-------|:-----------|:---------|:---------|:--------|:----------------------|:--------------|:------------|:----------|:----------|:---------------|:-------------------|:-----------------------|:----------------------|:-----------|:----------------|:----------------------|:----------|:-----------|:--------------|:--------------------|:-----------------|:-------------|:----------|:-----------------|:--------|:---------------|:--------|:------------|:---------|:------------|:-----------------|:-----|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | | X | | | | X | X | | X | | | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | | | X | | | | | X | | X | | | X | X | X | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | | X | X | | | X | X | X | X | | | | | | X | X | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | X | X | | | X | | | | X | X | | | | | X | | X | | | X | | | | | X | | | X | X | | | | X | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
4n3mone/test | ---
license: mit
---
|
KentoTsu/pablok | ---
license: openrail
---
|
open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO | ---
pretty_name: Evaluation run of adamo1139/Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adamo1139/Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO](https://huggingface.co/adamo1139/Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T22:48:03.858262](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO/blob/main/results_2024-01-10T22-48-03.858262.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.618311155719092,\n\
\ \"acc_stderr\": 0.03254669493878394,\n \"acc_norm\": 0.6264661480893854,\n\
\ \"acc_norm_stderr\": 0.033214129392877,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.4714753463607863,\n\
\ \"mc2_stderr\": 0.015440450531261194\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255795,\n\
\ \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.014593487694937742\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5770762796255726,\n\
\ \"acc_stderr\": 0.004930138842768223,\n \"acc_norm\": 0.7703644692292372,\n\
\ \"acc_norm_stderr\": 0.0041973886269400665\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880263,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880263\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.0398124054371786,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.0398124054371786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4523809523809524,\n \"acc_stderr\": 0.025634258115554955,\n \"\
acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.025634258115554955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481006,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481006\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.02777253333421898,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.02777253333421898\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635467,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635467\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7394957983193278,\n \"acc_stderr\": 0.02851025151234193,\n \
\ \"acc_norm\": 0.7394957983193278,\n \"acc_norm_stderr\": 0.02851025151234193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848043,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848043\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n\
\ \"acc_stderr\": 0.028867431449849313,\n \"acc_norm\": 0.7843137254901961,\n\
\ \"acc_norm_stderr\": 0.028867431449849313\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n\
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946007,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946007\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n\
\ \"acc_stderr\": 0.01644970820902608,\n \"acc_norm\": 0.4100558659217877,\n\
\ \"acc_norm_stderr\": 0.01644970820902608\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868052,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868052\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829707,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829707\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.01274920600765746,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.01274920600765746\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933116,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933116\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.4714753463607863,\n\
\ \"mc2_stderr\": 0.015440450531261194\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638261\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.26914329037149354,\n \
\ \"acc_stderr\": 0.012216595457292728\n }\n}\n```"
repo_url: https://huggingface.co/adamo1139/Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|arc:challenge|25_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|gsm8k|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hellaswag|10_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T22-48-03.858262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T22-48-03.858262.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- '**/details_harness|winogrande|5_2024-01-10T22-48-03.858262.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T22-48-03.858262.parquet'
- config_name: results
data_files:
- split: 2024_01_10T22_48_03.858262
path:
- results_2024-01-10T22-48-03.858262.parquet
- split: latest
path:
- results_2024-01-10T22-48-03.858262.parquet
---
# Dataset Card for Evaluation run of adamo1139/Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [adamo1139/Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO](https://huggingface.co/adamo1139/Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-10T22:48:03.858262](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO/blob/main/results_2024-01-10T22-48-03.858262.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each eval's results in its own configuration, in the "latest" split):
```python
{
"all": {
"acc": 0.618311155719092,
"acc_stderr": 0.03254669493878394,
"acc_norm": 0.6264661480893854,
"acc_norm_stderr": 0.033214129392877,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.4714753463607863,
"mc2_stderr": 0.015440450531261194
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255795,
"acc_norm": 0.5247440273037542,
"acc_norm_stderr": 0.014593487694937742
},
"harness|hellaswag|10": {
"acc": 0.5770762796255726,
"acc_stderr": 0.004930138842768223,
"acc_norm": 0.7703644692292372,
"acc_norm_stderr": 0.0041973886269400665
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880263,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880263
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.0398124054371786,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.0398124054371786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.025634258115554955,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.025634258115554955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481006,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481006
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.02777253333421898,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.02777253333421898
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635467,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635467
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7394957983193278,
"acc_stderr": 0.02851025151234193,
"acc_norm": 0.7394957983193278,
"acc_norm_stderr": 0.02851025151234193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946007,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946007
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.01644970820902608,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.01644970820902608
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868052,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868052
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829707,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829707
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.01274920600765746,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.01274920600765746
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933116,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933116
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.4714753463607863,
"mc2_stderr": 0.015440450531261194
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638261
},
"harness|gsm8k|5": {
"acc": 0.26914329037149354,
"acc_stderr": 0.012216595457292728
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nextpy/CodeExercise-Python-27k-EVOL | ---
license: apache-2.0
---
|
mlabonne/chatml_dpo_pairs | ---
tags:
- dpo
---
# ChatML DPO Pairs
This is a preprocessed version of [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs) using the [ChatML](https://huggingface.co/docs/transformers/chat_templating) format.
Like the original dataset, it contains 12k examples from the [Orca](https://arxiv.org/abs/2306.02707)-style dataset [Open-Orca/OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca).
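For illustration, ChatML wraps each conversation turn in `<|im_start|>`/`<|im_end|>` markers. The sketch below is a hypothetical helper (not part of the actual preprocessing, which relies on `tokenizer.apply_chat_template`) showing the layout the template produces:

```python
# Hypothetical sketch of the ChatML turn layout; the real dataset is built
# with tokenizer.apply_chat_template, which emits this same structure.
def to_chatml(role: str, content: str) -> str:
    # Wrap a single turn in ChatML start/end markers.
    return f"<|im_start|>{role}\n{content}<|im_end|>\n"

prompt = to_chatml("system", "You are a helpful assistant.") + to_chatml("user", "Hello!")
print(prompt)
```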
Here is the code used to preprocess it:
```python
from datasets import load_dataset
from transformers import AutoTokenizer

def chatml_format(example):
# Format system
if len(example['system']) > 0:
message = {"role": "system", "content": example['system']}
system = tokenizer.apply_chat_template([message], tokenize=False)
else:
system = ""
# Format instruction
message = {"role": "user", "content": example['question']}
prompt = tokenizer.apply_chat_template([message], tokenize=False, add_generation_prompt=True)
# Format chosen answer
chosen = example['chatgpt'] + "<|im_end|>\n"
# Format rejected answer
rejected = example['llama2-13b-chat'] + "<|im_end|>\n"
return {
"prompt": system + prompt,
"chosen": chosen,
"rejected": rejected,
}
# Load dataset
dataset = load_dataset("Intel/orca_dpo_pairs")['train']
# Save columns
original_columns = dataset.column_names
# Tokenizer
tokenizer = AutoTokenizer.from_pretrained("teknium/OpenHermes-2.5-Mistral-7B")
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "left"
# Format dataset
dataset = dataset.map(
chatml_format,
remove_columns=original_columns
)
``` |
Eternity-ai/home-emote-interaction-0-1 | ---
license: apache-2.0
---
|
hackathon-somos-nlp-2023/winogrande_train_s_spanish | ---
license: gpl-3.0
task_categories:
- text-classification
language:
- es
pretty_name: Winogrande in Spanish
size_categories:
- n<1K
---
This is the Spanish version of Winogrande Small (640 instances) for training only.
The translation was done manually by a group of experts. The dataset will still be improved in the future.
We also acknowledge Somos-NLP for this achievement. |
mohammadnajeeb/ccc_md | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 34553.0
num_examples: 6
download_size: 34969
dataset_size: 34553.0
---
# Dataset Card for "ccc_md"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carlosejimenez/wikibook-tokenized-block-size-512 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 15964254252
num_examples: 7779851
download_size: 7865415536
dataset_size: 15964254252
---
# Dataset Card for "wikibook-tokenized-block-size-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bstdev/touhou_portraits_more | ---
license: agpl-3.0
---
|
MikhailT/hifi-tts | ---
configs:
- config_name: clean
version: 1.0.0
data_files:
- split: train
path: data/train.clean-*
- split: test
path: data/test.clean-*
- split: dev
path: data/dev.clean-*
- config_name: other
version: 1.0.0
data_files:
- split: train
path: data/train.other-*
- split: test
path: data/test.other-*
- split: dev
path: data/dev.other-*
- config_name: all
version: 1.0.0
data_files:
- split: train.clean
path: data/train.clean-*
- split: train.other
path: data/train.other-*
- split: dev.clean
path: data/dev.clean-*
- split: dev.other
path: data/dev.other-*
- split: test.clean
path: data/test.clean-*
- split: test.other
path: data/test.other-*
dataset_info:
- config_name: clean
features:
- name: speaker
dtype: string
- name: file
dtype: string
- name: duration
dtype: float32
- name: text
dtype: string
- name: text_no_preprocessing
dtype: string
- name: text_normalized
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 44100
splits:
- name: train
num_bytes: 17023899243
num_examples: 125989
- name: dev
num_bytes: 24204633
num_examples: 150
- name: test
num_bytes: 52040552
num_examples: 300
download_size: 16271001158
dataset_size: 17104553676
- config_name: other
features:
- name: speaker
dtype: string
- name: file
dtype: string
- name: duration
dtype: float32
- name: text
dtype: string
- name: text_no_preprocessing
dtype: string
- name: text_normalized
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 44100
splits:
- name: train
num_bytes: 26755286687
num_examples: 196489
- name: dev
num_bytes: 65601521
num_examples: 350
- name: test
num_bytes: 129348882
num_examples: 700
download_size: 25655017468
dataset_size: 26957939607
- config_name: all
features:
- name: speaker
dtype: string
- name: file
dtype: string
- name: duration
dtype: float32
- name: text
dtype: string
- name: text_no_preprocessing
dtype: string
- name: text_normalized
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 44100
splits:
- name: train.clean
num_bytes: 17023899243
num_examples: 125989
- name: train.other
num_bytes: 26755286687
num_examples: 196489
- name: dev.clean
num_bytes: 24204633
num_examples: 150
- name: dev.other
num_bytes: 65601521
num_examples: 350
- name: test.clean
num_bytes: 52040552
num_examples: 300
- name: test.other
num_bytes: 129348882
num_examples: 700
download_size: 7040649041
dataset_size: 44050381518
pretty_name: HiFi TTS
description: >-
Hi-Fi Multi-Speaker English TTS Dataset (Hi-Fi TTS) is based on LibriVox's
public domain audio books and Gutenberg Project texts.
homepage: http://www.openslr.org/109
language:
- en
license:
- cc-by-4.0
citation: |
@article{bakhturina2021hi,
title={{Hi-Fi Multi-Speaker English TTS Dataset}},
author={Bakhturina, Evelina and Lavrukhin, Vitaly and Ginsburg, Boris and Zhang, Yang},
journal={arXiv preprint arXiv:2104.01497},
year={2021}
}
task_categories:
- text-to-speech
- text-to-audio
---
# Dataset Card for HiFiTTS
Hi-Fi Multi-Speaker English TTS Dataset (Hi-Fi TTS) is based on LibriVox's public domain audio books and Gutenberg Project texts. |
Nexdata/Korean_Speech_Data_by_Mobile_Phone_Reading | ---
---
# Dataset Card for Nexdata/Korean_Speech_Data_by_Mobile_Phone_Reading
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/60?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
It contains recordings from 291 native Korean speakers, collected in a quiet indoor environment. The recordings cover economics, entertainment, news, colloquial speech, figures, and letters, with 400 sentences per speaker. Recording devices are mainstream Android phones and iPhones.
For more details, please refer to the link: https://www.nexdata.ai/datasets/60?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train models for automatic speech recognition (ASR) and speaker identification.
### Languages
Korean
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
AyoubChLin/CompanyDocuments | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco | ---
pretty_name: Evaluation run of lgaalves/gpt2_platypus-dolly-guanaco
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/gpt2_platypus-dolly-guanaco](https://huggingface.co/lgaalves/gpt2_platypus-dolly-guanaco)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-28T14:27:44.520216](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco/blob/main/results_2023-09-28T14-27-44.520216.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\
\ \"em_stderr\": 0.0004913221265094559,\n \"f1\": 0.04980704697986585,\n\
\ \"f1_stderr\": 0.0013966099124026671,\n \"acc\": 0.2517758484609313,\n\
\ \"acc_stderr\": 0.007026065573457924\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094559,\n\
\ \"f1\": 0.04980704697986585,\n \"f1_stderr\": 0.0013966099124026671\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5035516969218626,\n\
\ \"acc_stderr\": 0.014052131146915848\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/gpt2_platypus-dolly-guanaco
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|arc:challenge|25_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_28T14_27_44.520216
path:
- '**/details_harness|drop|3_2023-09-28T14-27-44.520216.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-28T14-27-44.520216.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_28T14_27_44.520216
path:
- '**/details_harness|gsm8k|5_2023-09-28T14-27-44.520216.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-28T14-27-44.520216.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hellaswag|10_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T20:05:00.341927.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T20:05:00.341927.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T20:05:00.341927.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_28T14_27_44.520216
path:
- '**/details_harness|winogrande|5_2023-09-28T14-27-44.520216.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-28T14-27-44.520216.parquet'
- config_name: results
data_files:
- split: 2023_08_31T20_05_00.341927
path:
- results_2023-08-31T20:05:00.341927.parquet
- split: 2023_09_28T14_27_44.520216
path:
- results_2023-09-28T14-27-44.520216.parquet
- split: latest
path:
- results_2023-09-28T14-27-44.520216.parquet
---
# Dataset Card for Evaluation run of lgaalves/gpt2_platypus-dolly-guanaco
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt2_platypus-dolly-guanaco
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt2_platypus-dolly-guanaco](https://huggingface.co/lgaalves/gpt2_platypus-dolly-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-28T14:27:44.520216](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco/blob/main/results_2023-09-28T14-27-44.520216.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094559,
"f1": 0.04980704697986585,
"f1_stderr": 0.0013966099124026671,
"acc": 0.2517758484609313,
"acc_stderr": 0.007026065573457924
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094559,
"f1": 0.04980704697986585,
"f1_stderr": 0.0013966099124026671
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5035516969218626,
"acc_stderr": 0.014052131146915848
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
liuyanchen1015/MULTI_VALUE_qqp_who_what | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 476008
num_examples: 2385
- name: test
num_bytes: 4702599
num_examples: 23944
- name: train
num_bytes: 4329233
num_examples: 21586
download_size: 5782997
dataset_size: 9507840
---
# Dataset Card for "MULTI_VALUE_qqp_who_what"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/62de9313 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1332
dataset_size: 180
---
# Dataset Card for "62de9313"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
getawayfrommeXD/trec | ---
dataset_info:
features:
- name: label-coarse
dtype: int64
- name: text
dtype: string
- name: clean_text
dtype: string
splits:
- name: train
num_bytes: 485569
num_examples: 4952
- name: validation
num_bytes: 50526
num_examples: 500
- name: test
num_bytes: 36238
num_examples: 500
download_size: 0
dataset_size: 572333
---
# Dataset Card for "trec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Ahmadsameh8/songlyrics | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: target_text
dtype: string
splits:
- name: train
num_bytes: 1921042
num_examples: 822
- name: validation
num_bytes: 251598
num_examples: 102
- name: test
num_bytes: 243625
num_examples: 104
download_size: 1059520
dataset_size: 2416265
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
kimnt93/OpenOrca-50k | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 85583064
num_examples: 50000
download_size: 49265986
dataset_size: 85583064
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# OpenOrca-50k Dataset
## Description
OpenOrca-50k is a curated subset of the original Open-Orca dataset available on HuggingFace. This subset contains 50,000 random samples from the main dataset. It has been extracted to serve specific research purposes, especially for those requiring a smaller but representative portion of the original dataset.
Each entry in the dataset has the following structure:
- `id`: The unique identifier for the sample.
- `system_prompt`: System-generated prompt or context for the interaction.
- `question`: The main question posed, corresponding to the given prompt.
- `response`: The system's or model's response to the question.
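A record matching the fields above can be sketched as a plain Python mapping. The values here are invented for illustration and are not taken from the dataset; only the field names come from the schema described above.

```python
# Hypothetical sample illustrating the schema described above;
# the field values are invented for demonstration.
sample = {
    "id": "niv.242684",
    "system_prompt": "You are a helpful assistant.",
    "question": "What is the capital of France?",
    "response": "The capital of France is Paris.",
}

# The four fields every record is expected to carry.
EXPECTED_FIELDS = {"id", "system_prompt", "question", "response"}

def is_valid_record(record):
    """Check that a record has exactly the expected string-valued fields."""
    return set(record) == EXPECTED_FIELDS and all(
        isinstance(record[key], str) for key in EXPECTED_FIELDS
    )

print(is_valid_record(sample))  # True
```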
## Source
The original dataset can be found [here](https://huggingface.co/datasets/Open-Orca/OpenOrca).
## Usage
This dataset is primarily tailored for researchers and machine learning practitioners who wish to work with a smaller version of the Open-Orca dataset. It is ideal for rapid prototyping or for scenarios with limited computational resources.
To efficiently load the dataset using HuggingFace's datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("kimnt93/OpenOrca-50k")
```
## License
[Open-Orca](https://huggingface.co/datasets/Open-Orca/OpenOrca)
one-sec-cv12/chunk_19 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 27056049264.375
num_examples: 281693
download_size: 24232487237
dataset_size: 27056049264.375
---
# Dataset Card for "chunk_19"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
kpriyanshu256/semeval-mono-test | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 83230306
num_examples: 34272
download_size: 44874416
dataset_size: 83230306
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---