| datasetId | card |
|---|---|
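Each `card` value below is a Markdown document that typically opens with a YAML frontmatter block fenced by `---` lines, followed by the card body. A minimal sketch of separating the two parts (the helper name and the sample card are illustrative, not taken from the dataset):

```python
def split_card(card: str):
    """Return (frontmatter, body); frontmatter is '' if the card has none."""
    if card.startswith("---\n"):
        end = card.find("\n---", 4)  # locate the closing '---' delimiter
        if end != -1:
            frontmatter = card[4:end]
            body = card[end + 4:].lstrip("\n")
            return frontmatter, body
    # No frontmatter: the whole card is body text
    return "", card

sample = "---\nlicense: apache-2.0\n---\n# Dataset Card\nSome text.\n"
fm, body = split_card(sample)
# fm   -> "license: apache-2.0"
# body -> "# Dataset Card\nSome text.\n"
```

The frontmatter string can then be handed to any YAML parser to recover fields such as `license`, `dataset_info`, or `configs`.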
pachequinho/restaurant_reviews | ---
license: apache-2.0
---
|
pytorch-survival/gbsg_pycox | ---
dataset_info:
features:
- name: x0
dtype: float32
- name: x1
dtype: float32
- name: x2
dtype: float32
- name: x3
dtype: float32
- name: x4
dtype: float32
- name: x5
dtype: float32
- name: x6
dtype: float32
- name: event_time
dtype: float32
- name: event_indicator
dtype: int32
splits:
- name: train
num_bytes: 80352
num_examples: 2232
download_size: 34711
dataset_size: 80352
---
# Dataset Card for "gbsg_pycox"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TryOnVirtual/VITON-HD-TEST | ---
license: cc-by-4.0
--- |
peldrak/riviera_labeled_split2 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 99881509.0
num_examples: 410
download_size: 99800081
dataset_size: 99881509.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hwpark12/llama2_custom_kr | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
turuta/Multi30k-uk | ---
license: unknown
task_categories:
- translation
- text-generation
language:
- uk
- en
pretty_name: ukr-multi30k
size_categories:
- 10K<n<100K
tags:
- common
- multi30k
- ukrainian
---
## Dataset Multi30k: English-Ukrainian variation
The Multi30K dataset is designed to support multilingual multimodal research.
It originally extended the Flickr30K dataset with German translations: the English descriptions were collected on a crowdsourcing platform, while the translations were produced by professionally contracted translators.
We present a variation of this dataset manually translated into Ukrainian.
Paper:
```bibtex
@inproceedings{saichyshyna-etal-2023-extension,
title = "Extension {M}ulti30{K}: Multimodal Dataset for Integrated Vision and Language Research in {U}krainian",
author = "Saichyshyna, Nataliia and
Maksymenko, Daniil and
Turuta, Oleksii and
Yerokhin, Andriy and
Babii, Andrii and
Turuta, Olena",
booktitle = "Proceedings of the Second Ukrainian Natural Language Processing Workshop (UNLP)",
month = may,
year = "2023",
address = "Dubrovnik, Croatia",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.unlp-1.7",
pages = "54--61",
abstract = "We share the results of the project within the well-known Multi30k dataset dedicated to improving machine translation of text from English into Ukrainian. The main task was to manually prepare the dataset and improve the translation of texts. The importance of collecting such datasets for low-resource languages for improving the quality of machine translation has been discussed. We also studied the features of translations of words and sentences with ambiguous meanings. The collection of multimodal datasets is essential for natural language processing tasks because it allows the development of more complex and comprehensive machine learning models that can understand and analyze different types of data. These models can learn from a variety of data types, including images, text, and audio, for more accurate and meaningful results.",
}
``` |
open-llm-leaderboard/details_chargoddard__internlm2-20b-llama | ---
pretty_name: Evaluation run of chargoddard/internlm2-20b-llama
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/internlm2-20b-llama](https://huggingface.co/chargoddard/internlm2-20b-llama)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__internlm2-20b-llama\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T17:52:12.379059](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-20b-llama/blob/main/results_2024-01-18T17-52-12.379059.json) (note\
\ that there may be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6745878141840577,\n\
\ \"acc_stderr\": 0.03167073179485568,\n \"acc_norm\": 0.6749429408040124,\n\
\ \"acc_norm_stderr\": 0.032336180832304856,\n \"mc1\": 0.38310893512851896,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5413315935552325,\n\
\ \"mc2_stderr\": 0.015717993914435745\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6151877133105802,\n \"acc_stderr\": 0.014218371065251098,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756558\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6414060944035053,\n\
\ \"acc_stderr\": 0.004786075107572185,\n \"acc_norm\": 0.8312089225253934,\n\
\ \"acc_norm_stderr\": 0.00373801773403787\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708045,\n\
\ \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708045\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059004,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059004\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745647,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745647\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.0402873153294756,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.0402873153294756\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5105820105820106,\n \"acc_stderr\": 0.02574554227604549,\n \"\
acc_norm\": 0.5105820105820106,\n \"acc_norm_stderr\": 0.02574554227604549\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.02141724293632158,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.02141724293632158\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822032,\n\
\ \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822032\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562066,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562066\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603627,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645354,\n\
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646508,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646508\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7436974789915967,\n \"acc_stderr\": 0.02835962087053395,\n \
\ \"acc_norm\": 0.7436974789915967,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649412,\n \"\
acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649412\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.023094329582595694,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.023094329582595694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.030216831011508783,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.030216831011508783\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.01872430174194164,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.01872430174194164\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899115,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899115\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.016384638410380823,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.016384638410380823\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.0247238615047717,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.0247238615047717\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.02346842983245114,\n\
\ \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.02346842983245114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5058670143415906,\n\
\ \"acc_stderr\": 0.012769356925216526,\n \"acc_norm\": 0.5058670143415906,\n\
\ \"acc_norm_stderr\": 0.012769356925216526\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.025206963154225395,\n\
\ \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.025206963154225395\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5413315935552325,\n\
\ \"mc2_stderr\": 0.015717993914435745\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7065959059893859,\n \
\ \"acc_stderr\": 0.01254183081546149\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/internlm2-20b-llama
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|arc:challenge|25_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|arc:challenge|25_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|gsm8k|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|gsm8k|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hellaswag|10_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hellaswag|10_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-18-39.754211.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T17-52-12.379059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T17-52-12.379059.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- '**/details_harness|winogrande|5_2024-01-18T13-18-39.754211.parquet'
- split: 2024_01_18T17_52_12.379059
path:
- '**/details_harness|winogrande|5_2024-01-18T17-52-12.379059.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T17-52-12.379059.parquet'
- config_name: results
data_files:
- split: 2024_01_18T13_18_39.754211
path:
- results_2024-01-18T13-18-39.754211.parquet
- split: 2024_01_18T17_52_12.379059
path:
- results_2024-01-18T17-52-12.379059.parquet
- split: latest
path:
- results_2024-01-18T17-52-12.379059.parquet
---
# Dataset Card for Evaluation run of chargoddard/internlm2-20b-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chargoddard/internlm2-20b-llama](https://huggingface.co/chargoddard/internlm2-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__internlm2-20b-llama",
"harness_winogrande_5",
split="train")
```
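Once the details or aggregated results are loaded, the per-task metrics can be post-processed locally. As a minimal sketch, the snippet below computes a macro-average accuracy over MMLU (`hendrycksTest`) tasks from a results dictionary shaped like the "Latest results" JSON further down; the three entries shown are illustrative values copied from that section, not the full task set.

```python
# Illustrative subset of per-task entries, in the same shape as the
# results JSON below (keys are "harness|hendrycksTest-<task>|5").
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.42},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8026315789473685},
}

# Macro-average: unweighted mean of per-task accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(macro_avg, 4))  # → 0.61
```

With all 57 MMLU tasks included, the same computation reproduces the leaderboard's aggregated MMLU score.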
## Latest results
These are the [latest results from run 2024-01-18T17:52:12.379059](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-20b-llama/blob/main/results_2024-01-18T17-52-12.379059.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6745878141840577,
"acc_stderr": 0.03167073179485568,
"acc_norm": 0.6749429408040124,
"acc_norm_stderr": 0.032336180832304856,
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5413315935552325,
"mc2_stderr": 0.015717993914435745
},
"harness|arc:challenge|25": {
"acc": 0.6151877133105802,
"acc_stderr": 0.014218371065251098,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756558
},
"harness|hellaswag|10": {
"acc": 0.6414060944035053,
"acc_stderr": 0.004786075107572185,
"acc_norm": 0.8312089225253934,
"acc_norm_stderr": 0.00373801773403787
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708045,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059004,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059004
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745647,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745647
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.0402873153294756,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.0402873153294756
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5105820105820106,
"acc_stderr": 0.02574554227604549,
"acc_norm": 0.5105820105820106,
"acc_norm_stderr": 0.02574554227604549
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632158,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632158
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822032,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822032
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562066,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562066
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603627,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645354,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646508,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646508
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7436974789915967,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.7436974789915967,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8623853211009175,
"acc_stderr": 0.014770105878649412,
"acc_norm": 0.8623853211009175,
"acc_norm_stderr": 0.014770105878649412
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.023094329582595694,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.023094329582595694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508783,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508783
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194164,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194164
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899115,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899115
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.016384638410380823,
"acc_norm": 0.4,
"acc_norm_stderr": 0.016384638410380823
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.0247238615047717,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.0247238615047717
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.02346842983245114,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.02346842983245114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5058670143415906,
"acc_stderr": 0.012769356925216526,
"acc_norm": 0.5058670143415906,
"acc_norm_stderr": 0.012769356925216526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5413315935552325,
"mc2_stderr": 0.015717993914435745
},
"harness|winogrande|5": {
"acc": 0.8421468034727704,
"acc_stderr": 0.010247165248719763
},
"harness|gsm8k|5": {
"acc": 0.7065959059893859,
"acc_stderr": 0.01254183081546149
}
}
```
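For a quick sanity check, the per-task accuracies in the JSON above can be aggregated directly once loaded as a Python dict. A minimal sketch, abridged to two of the `hendrycksTest` sub-tasks (in practice, load the full results JSON from the repo first):

```python
# Abridged excerpt of the aggregated results shown above (two sub-tasks only).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.42},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
}

# Mean accuracy over the MMLU (hendrycksTest) sub-tasks.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # → 0.5137
```

The same key-prefix filter works on the full results file, where all 57 MMLU sub-tasks share the `harness|hendrycksTest-` prefix.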
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jonathang/dreambooth-hackathon-images-proteins | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 3961830.0
num_examples: 17
download_size: 3905517
dataset_size: 3961830.0
---
# Dataset Card for "dreambooth-hackathon-images-proteins"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_aint_before_main | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 55937
num_examples: 123
- name: train
num_bytes: 47104
num_examples: 96
download_size: 77636
dataset_size: 103041
---
# Dataset Card for "MULTI_VALUE_rte_aint_before_main"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
erhwenkuo/firefly-train-chinese-zhtw | ---
dataset_info:
features:
- name: kind
dtype: string
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 1116720203
num_examples: 1649399
download_size: 800075000
dataset_size: 1116720203
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-generation
language:
- zh
size_categories:
- 1M<n<10M
---
# Dataset Card for "firefly-train-chinese-zhtw"
## Dataset Summary
This dataset was primarily used in the project [Firefly(流螢): a Chinese conversational large language model](https://github.com/yangjianxin1/Firefly); training on it produced the model [firefly-1b4](https://huggingface.co/YeungNLP/firefly-bloom-1b4).
The [Firefly(流螢)](https://github.com/yangjianxin1/Firefly) project collected 23 common Chinese datasets, and for each NLP task several instruction templates were written by hand to ensure data quality and richness.
The dataset contains 1.15 million examples. The data distribution is shown in the figure below:

The token-length distribution of the training data is shown in the figure below; the vast majority of examples are shorter than 600 tokens:

Original data sources:
- [YeungNLP/firefly-train-1.1M](https://huggingface.co/datasets/YeungNLP/firefly-train-1.1M)
- [Firefly(流萤): 中文对话式大语言模型](https://github.com/yangjianxin1/Firefly)
## Data Download and Cleaning
1. Download the chinese-poetry repo (the most comprehensive database of classical Chinese poetry)
2. Use OpenCC for Simplified-to-Traditional Chinese conversion
3. Use Hugging Face Datasets to upload the data to the Hugging Face Hub
## Dataset Structure
```json
{
"kind": "ClassicalChinese",
"input": "將下面句子翻譯成現代文:\n石中央又生一樹,高百餘尺,條幹偃陰為五色,翠葉如盤,花徑尺餘,色深碧,蕊深紅,異香成煙,著物霏霏。",
"target": "大石的中央長著一棵樹,一百多尺高,枝幹是彩色的,樹葉有盤子那樣大,花的直徑有一尺寬,花瓣深藍色,花中飄出奇異的香氣籠罩著周圍,如煙似霧。"
}
```
## Data Fields
- `kind`: (string) task category
- `input`: (string) task input
- `target`: (string) task output target
## How to Use
```python
from datasets import load_dataset
dataset = load_dataset("erhwenkuo/firefly-train-chinese-zhtw", split="train")
```
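To restrict the corpus to a single task type, filter on the `kind` column. A minimal sketch over illustrative in-memory records mirroring the schema above (the `"Couplet"` kind value is hypothetical and used only for this sketch; on the loaded dataset, the same predicate can be passed to `dataset.filter(...)`):

```python
# Illustrative records mirroring the dataset schema; "Couplet" is a
# hypothetical kind value used only for this sketch.
records = [
    {"kind": "ClassicalChinese", "input": "...", "target": "..."},
    {"kind": "Couplet", "input": "...", "target": "..."},
    {"kind": "ClassicalChinese", "input": "...", "target": "..."},
]

# Keep only the Classical-Chinese translation examples.
classical = [r for r in records if r["kind"] == "ClassicalChinese"]
print(len(classical))  # → 2
```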
## License Information
The data source does not define any license information.
## Citation
```
@misc{Firefly,
author = {Jianxin Yang},
title = {Firefly(流萤): 中文对话式大语言模型},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/yangjianxin1/Firefly}},
}
``` |
ziozzang/deepl-trans-PT-KO | ---
task_categories:
- translation
language:
- ko
- pt
---
This dataset consists of Wikipedia articles, automatically aggregated and translated with DeepL.
# String/Corpus pairs
From PT (Portuguese) to KO (Korean).
# Quality Filtering
- Stripped all HTML tags.
- Removed references and annotation marks.
- Filtered by string length.
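The length-based step above can be sketched as a simple predicate over sentence pairs. The thresholds here are illustrative; the card does not state the actual cutoffs used:

```python
def keep_pair(src: str, tgt: str, min_len: int = 20, max_len: int = 2000) -> bool:
    """Length-based quality filter: drop pairs whose source or target
    falls outside the [min_len, max_len] character range.
    The thresholds are illustrative, not the card's actual values."""
    return min_len <= len(src) <= max_len and min_len <= len(tgt) <= max_len

pairs = [
    ("a", "b"),  # too short on both sides, dropped
    ("Uma frase longa o suficiente em portugues.", "충분히 긴 한국어 문장의 예시입니다."),
]
kept = [p for p in pairs if keep_pair(*p)]
print(len(kept))  # → 1
```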
---
The strings/corpus were aggregated from Portuguese Wikipedia and translated with DeepL.
All data collected by Jioh L. Jung <ziozzang@gmail.com>.
license: mit
--- |
Tural/MetaMathQA | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: type
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 288517804
num_examples: 395000
download_size: 141855468
dataset_size: 288517804
---
# Dataset Card for "MetaMathQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonathang/dreambooth-hackathon-images-sbob | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1488165.0
num_examples: 4
download_size: 1489345
dataset_size: 1488165.0
---
# Dataset Card for "dreambooth-hackathon-images-sbob"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Yamei/TVCG_Paper_NER | ---
dataset_info:
features:
- name: data
struct:
- name: adjacentArticles
struct:
- name: __typename
dtype: string
- name: next
struct:
- name: __typename
dtype: string
- name: articleId
dtype: string
- name: fno
dtype: string
- name: previous
struct:
- name: __typename
dtype: string
- name: articleId
dtype: string
- name: fno
dtype: string
- name: article
struct:
- name: __typename
dtype: string
- name: abstract
dtype: string
- name: abstracts
list:
- name: __typename
dtype: string
- name: abstractType
dtype: string
- name: content
dtype: string
- name: authors
list:
- name: __typename
dtype: string
- name: affiliation
dtype: string
- name: fullName
dtype: string
- name: givenName
dtype: string
- name: surname
dtype: string
- name: doi
dtype: string
- name: fno
dtype: string
- name: hasPdf
dtype: bool
- name: id
dtype: string
- name: idPrefix
dtype: string
- name: isOpenAccess
dtype: bool
- name: isbn
dtype: 'null'
- name: issn
dtype: string
- name: issueNum
dtype: string
- name: keywords
sequence: string
- name: normalizedAbstract
dtype: string
- name: normalizedTitle
dtype: string
- name: notes
dtype: 'null'
- name: notesType
dtype: 'null'
- name: pages
dtype: string
- name: pubDate
dtype: string
- name: pubType
dtype: string
- name: replicability
struct:
- name: __typename
dtype: string
- name: codeDownloadUrl
dtype: string
- name: codeRepositoryUrl
dtype: string
- name: isEnabled
dtype: bool
- name: showBuyMe
dtype: bool
- name: showRecommendedArticles
dtype: bool
- name: title
dtype: string
- name: year
dtype: string
- name: articleVideos
sequence: 'null'
- name: entities
sequence:
sequence: string
- name: issue
struct:
- name: __typename
dtype: string
- name: downloadables
struct:
- name: __typename
dtype: string
- name: hasCover
dtype: bool
- name: id
dtype: string
- name: idPrefix
dtype: string
- name: issueNum
dtype: string
- name: label
dtype: string
- name: pubType
dtype: string
- name: title
dtype: string
- name: volume
dtype: string
- name: year
dtype: string
- name: recommendedArticles
list:
- name: __typename
dtype: string
- name: abstractUrl
dtype: string
- name: doi
dtype: 'null'
- name: id
dtype: string
- name: parentPublication
struct:
- name: __typename
dtype: string
- name: id
dtype: string
- name: title
dtype: string
- name: title
dtype: string
- name: webExtras
list:
- name: __typename
dtype: string
- name: extension
dtype: string
- name: id
dtype: string
- name: location
dtype: string
- name: name
dtype: string
- name: size
dtype: string
splits:
- name: train
num_bytes: 42952165
num_examples: 5178
download_size: 17356935
dataset_size: 42952165
---
# Dataset Card for "TVCG_Paper_NER"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cyli133/calendareai | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-Aya-101 | ---
pretty_name: Evaluation run of MaziyarPanahi/Mistral-7B-Instruct-Aya-101
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Mistral-7B-Instruct-Aya-101](https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-Aya-101)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-Aya-101\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T20:26:40.729869](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-Aya-101/blob/main/results_2024-03-09T20-26-40.729869.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6172980252936014,\n\
\ \"acc_stderr\": 0.032867077607959115,\n \"acc_norm\": 0.6226740310135067,\n\
\ \"acc_norm_stderr\": 0.03353287092767944,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5270943122503622,\n\
\ \"mc2_stderr\": 0.015098813242240394\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633823,\n\
\ \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345426998\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6266679944234216,\n\
\ \"acc_stderr\": 0.004827006520802887,\n \"acc_norm\": 0.8320055765783708,\n\
\ \"acc_norm_stderr\": 0.003730972670511862\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n\
\ \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n\
\ \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397467,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.024784316942156395,\n\
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.024784316942156395\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886783,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886783\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266864,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266864\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924985,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924985\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946005,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946005\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400172,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172542,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172542\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n\
\ \"acc_stderr\": 0.012643004623790203,\n \"acc_norm\": 0.42959582790091266,\n\
\ \"acc_norm_stderr\": 0.012643004623790203\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.032357437893550424,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.032357437893550424\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5270943122503622,\n\
\ \"mc2_stderr\": 0.015098813242240394\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.01173504356412673\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3912054586808188,\n \
\ \"acc_stderr\": 0.0134425024027943\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-Aya-101
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|winogrande|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T20-26-40.729869.parquet'
- config_name: results
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- results_2024-03-09T20-26-40.729869.parquet
- split: latest
path:
- results_2024-03-09T20-26-40.729869.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Mistral-7B-Instruct-Aya-101
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Mistral-7B-Instruct-Aya-101](https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-Aya-101) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-Aya-101",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-09T20:26:40.729869](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-Aya-101/blob/main/results_2024-03-09T20-26-40.729869.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6172980252936014,
"acc_stderr": 0.032867077607959115,
"acc_norm": 0.6226740310135067,
"acc_norm_stderr": 0.03353287092767944,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5270943122503622,
"mc2_stderr": 0.015098813242240394
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633823,
"acc_norm": 0.591296928327645,
"acc_norm_stderr": 0.014365750345426998
},
"harness|hellaswag|10": {
"acc": 0.6266679944234216,
"acc_stderr": 0.004827006520802887,
"acc_norm": 0.8320055765783708,
"acc_norm_stderr": 0.003730972670511862
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601684,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601684
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397467,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.024784316942156395,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.024784316942156395
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886783,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886783
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266864,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266864
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114969,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114969
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924985,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924985
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946005,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400172,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172542,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172542
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.012643004623790203,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.012643004623790203
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.01952431674486635,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.01952431674486635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.032357437893550424,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.032357437893550424
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5270943122503622,
"mc2_stderr": 0.015098813242240394
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.01173504356412673
},
"harness|gsm8k|5": {
"acc": 0.3912054586808188,
"acc_stderr": 0.0134425024027943
}
}
```
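The per-task entries in the JSON above can be aggregated into a single summary number the way the leaderboard's macro-averages are built; a minimal sketch, using only an illustrative subset of the `hendrycksTest` tasks rather than the full result set:

```python
# Sketch: macro-average the per-task accuracies from a results dict
# shaped like the one above. The tasks/scores here are an illustrative
# subset, not the full evaluation output.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6776},
    "harness|hendrycksTest-virology|5": {"acc": 0.5060},
}

# Collect every MMLU-style task score and take the unweighted mean.
mmlu_scores = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_macro_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_macro_avg, 4))  # → 0.5921
```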
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
alkav/customer_feedback | ---
dataset_info:
features:
- name: text
struct:
- name: text
dtype: string
splits:
- name: train
num_bytes: 64266
num_examples: 90
- name: test
num_bytes: 7460
num_examples: 10
download_size: 21684
dataset_size: 71726
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AdiOO7/data | ---
task_categories:
- text-classification
language:
- en
tags:
- code
pretty_name: Ticket-Category
size_categories:
- n<1K
--- |
apollo-research/Skylion007-openwebtext-tokenizer-EleutherAI-gpt-neox-20b | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 36084316104.0
num_examples: 4402674
download_size: 16697312035
dataset_size: 36084316104.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adi-kmt/gooftagoo | ---
license: apache-2.0
task_categories:
- text-generation
language:
- hi
- en
tags:
- hinglish
- conversation
- hindi
---
## Hindi/Hinglish Conversation Dataset
This repository contains a dataset of conversational text in Hindi and Hinglish (a mix of Hindi and English).
The conversation dataset contains multi-turn conversations on multiple topics, usually revolving around daily real-life experiences.
A small number of reasoning tasks have also been added (specifically CoT-style reasoning and coding), with about 1k samples from OpenHermes 2.5.
## Caution
This dataset was generated; please note that some content may not be entirely precise or reflect expert consensus.
Users are encouraged to verify information independently for scholarly or critical purposes. |
sazirarrwth99/last_stage_dataset_kangoroo_SFT | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 143442048
num_examples: 23291
download_size: 44947170
dataset_size: 143442048
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JeremiahZ/humaneval_x_llvm_wasm | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: prompt
dtype: string
- name: declaration
dtype: string
- name: canonical_solution
dtype: string
- name: test
dtype: string
- name: example_test
dtype: string
- name: llvm_ir
dtype: string
- name: wat
dtype: string
splits:
- name: test
num_bytes: 4945639
num_examples: 161
download_size: 1096385
dataset_size: 4945639
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "humaneval_x_llvm_wasm"
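Each row carries the string pieces needed to rebuild a complete, compilable program (plus the `llvm_ir` and `wat` texts). A minimal sketch of stitching those fields together — the row below is a placeholder, not an actual entry from the `test` split:

```python
# Sketch: assemble a self-contained program from a HumanEval-X-style row.
# The field values are placeholders; real rows come from the "test" split.
row = {
    "task_id": "CPP/0",
    "declaration": "int add(int a, int b);\n",
    "canonical_solution": "int add(int a, int b) { return a + b; }\n",
    "test": "int main() { return add(1, 2) == 3 ? 0 : 1; }\n",
}

# declaration + canonical_solution + test yields a compilable unit.
program = row["declaration"] + row["canonical_solution"] + row["test"]
print(program)
```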
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ravithejads/telugu_alpaca | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 389859
num_examples: 202
download_size: 172527
dataset_size: 389859
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Pablao0948/Nelson_Ned | ---
license: openrail
---
|
icaro23/jeanv23 | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-xsum-default-199117-68890145630 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: d0rj/rut5-base-summ
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: d0rj/rut5-base-summ
* Dataset: xsum
* Config: default
* Split: test
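The `col_mapping` in the card metadata tells the evaluator which dataset columns play which role for the task (here, `document` as the input text and `summary` as the target). A minimal sketch of applying such a mapping to one row — the row contents are illustrative:

```python
# Sketch: map a dataset row's columns onto the field names a task
# expects, using a col_mapping like the one in this card
# (task field -> dataset column). The row is illustrative.
col_mapping = {"text": "document", "target": "summary"}

row = {"document": "Full article text ...", "summary": "Short summary."}

mapped = {task_field: row[dataset_col]
          for task_field, dataset_col in col_mapping.items()}
print(mapped)
```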
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@d0rj](https://huggingface.co/d0rj) for evaluating this model. |
DataForGood/taxobservatory_data | ---
language:
- en
size_categories:
- n<1K
---
This dataset contains an initial dump of some 400 country-by-country reports published
by multinational corporations and collected by the [EU Tax Observatory](https://www.taxobservatory.eu/).
These files are located in the `pdf` and `xls` directories. The `csv` directory
contains manually curated reference files representing what should be
extracted from the pdf and xlsx files.
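Given that layout, pairing each source report with its curated reference file comes down to matching filename stems across directories; a minimal sketch, with illustrative file names standing in for the repo's `pdf` and `csv` contents:

```python
from pathlib import Path

# Sketch: pair each source report with its curated reference CSV by
# filename stem. The lists below are illustrative stand-ins for the
# repo's pdf/ and csv/ directories (in practice, use Path(...).glob()).
pdf_files = [Path("pdf/acme_2021.pdf"), Path("pdf/globex_2020.pdf")]
csv_files = [Path("csv/acme_2021.csv")]

references = {p.stem: p for p in csv_files}
pairs = [(pdf, references.get(pdf.stem)) for pdf in pdf_files]

for pdf, ref in pairs:
    status = "has reference" if ref else "no reference yet"
    print(f"{pdf.name}: {status}")
```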
|
CyberHarem/kohinata_miho_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kohinata_miho/小日向美穂/코히나타미호 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kohinata_miho/小日向美穂/코히나타미호 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `ahoge, short_hair, black_hair, brown_eyes, breasts, bangs, bow, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 600.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kohinata_miho_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 358.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kohinata_miho_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1169 | 753.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kohinata_miho_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 530.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kohinata_miho_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1169 | 1.03 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kohinata_miho_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kohinata_miho_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, hair_between_eyes, hair_bow, looking_at_viewer, pink_bow, pink_hairband, solo, white_gloves, bare_shoulders, frills, pink_dress, plaid, sleeveless_dress, :d, collarbone, necklace, open_mouth, simple_background, upper_body, white_background, bare_arms, choker, collared_shirt, hair_intakes, hand_up, pom_pom_(clothes), red_necktie, sleeveless_shirt, white_shirt |
| 1 | 15 |  |  |  |  |  | 1girl, blush, solo, necktie, pink_hairband, smile, skirt, looking_at_viewer, open_mouth, white_thighhighs |
| 2 | 5 |  |  |  |  |  | 1girl, hairband, navel, open_mouth, skirt, solo, thighhighs, microphone, midriff, wrist_cuffs, :d, blush, choker |
| 3 | 10 |  |  |  |  |  | 1girl, blush, school_uniform, solo, sweater_vest, looking_at_viewer, smile, open_mouth, skirt, upper_body |
| 4 | 12 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, solo, white_background, simple_background, collarbone, shirt, hair_between_eyes, upper_body, yellow_eyes |
| 5 | 6 |  |  |  |  |  | 1girl, blush, hairband, looking_at_viewer, plaid_bow, school_uniform, solo, bowtie, hair_bow, love_letter, red_bow, blazer, long_sleeves, pleated_skirt, hair_between_eyes, holding_letter, petals, plaid_skirt, shirt, simple_background, white_background |
| 6 | 5 |  |  |  |  |  | 1girl, blush, solo, cleavage, looking_at_viewer, navel, pink_bikini, smile, white_bikini |
| 7 | 9 |  |  |  |  |  | 1girl, blue_sky, cloud, day, looking_at_viewer, ocean, outdoors, blush, smile, solo, cleavage, collarbone, navel, white_bikini, beach, bikini_skirt, hair_ornament, open_mouth, lens_flare |
| 8 | 6 |  |  |  |  |  | 1girl, blush, hair_flower, looking_at_viewer, side-tie_bikini_bottom, solo, striped_bikini, navel, see-through, white_shirt, bracelet, collarbone, day, outdoors, water, yellow_eyes, blue_sky, wet_shirt |
| 9 | 7 |  |  |  |  |  | 1girl, crop_top, red_gloves, red_headwear, solo, blush, fur-trimmed_gloves, fur-trimmed_skirt, hair_between_eyes, midriff, navel, plaid, red_skirt, santa_costume, santa_hat, beret, looking_at_viewer, open_mouth, :d, capelet, cleavage, earrings, holding_sack, short_sleeves, merry_christmas, red_shirt, thighhighs |
| 10 | 6 |  |  |  |  |  | 1girl, blush, head_wings, looking_at_viewer, puffy_short_sleeves, solo, wrist_cuffs, bat_wings, maid_headdress, pink_dress, frilled_dress, open_mouth, ribbon, :d, bowtie, claw_pose, food, frilled_apron, hair_between_eyes, hairband, heart, simple_background, star_(symbol), striped, waist_apron, white_apron, white_background, yellow_eyes |
| 11 | 6 |  |  |  |  |  | 1girl, blush, collarbone, looking_at_viewer, navel, pink_bra, pink_panties, solo, underwear_only, bow_panties, brown_hair, cleavage, cowboy_shot, groin, hair_between_eyes, on_bed, parted_lips, simple_background, stomach, thigh_gap, white_background |
| 12 | 19 |  |  |  |  |  | 1girl, blush, nipples, navel, open_mouth, completely_nude, hetero, solo_focus, sweat, 1boy, pussy, mosaic_censoring, spread_legs, looking_at_viewer, female_pubic_hair, penis, collarbone, saliva, sex, sitting |
| 13 | 7 |  |  |  |  |  | 1girl, detached_collar, fake_animal_ears, playboy_bunny, rabbit_ears, solo, black_pantyhose, looking_at_viewer, strapless_leotard, wrist_cuffs, black_bowtie, black_leotard, cleavage, white_background, bare_shoulders, blush, cowboy_shot, open_mouth, rabbit_tail, simple_background, white_leotard |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | hair_between_eyes | hair_bow | looking_at_viewer | pink_bow | pink_hairband | solo | white_gloves | bare_shoulders | frills | pink_dress | plaid | sleeveless_dress | :d | collarbone | necklace | open_mouth | simple_background | upper_body | white_background | bare_arms | choker | collared_shirt | hair_intakes | hand_up | pom_pom_(clothes) | red_necktie | sleeveless_shirt | white_shirt | necktie | smile | skirt | white_thighhighs | hairband | navel | thighhighs | microphone | midriff | wrist_cuffs | school_uniform | sweater_vest | shirt | yellow_eyes | plaid_bow | bowtie | love_letter | red_bow | blazer | long_sleeves | pleated_skirt | holding_letter | petals | plaid_skirt | cleavage | pink_bikini | white_bikini | blue_sky | cloud | day | ocean | outdoors | beach | bikini_skirt | hair_ornament | lens_flare | hair_flower | side-tie_bikini_bottom | striped_bikini | see-through | bracelet | water | wet_shirt | crop_top | red_gloves | red_headwear | fur-trimmed_gloves | fur-trimmed_skirt | red_skirt | santa_costume | santa_hat | beret | capelet | earrings | holding_sack | short_sleeves | merry_christmas | red_shirt | head_wings | puffy_short_sleeves | bat_wings | maid_headdress | frilled_dress | ribbon | claw_pose | food | frilled_apron | heart | star_(symbol) | striped | waist_apron | white_apron | pink_bra | pink_panties | underwear_only | bow_panties | brown_hair | cowboy_shot | groin | on_bed | parted_lips | stomach | thigh_gap | nipples | completely_nude | hetero | solo_focus | sweat | 1boy | pussy | mosaic_censoring | spread_legs | female_pubic_hair | penis | saliva | sex | sitting | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | black_pantyhose | strapless_leotard | black_bowtie | black_leotard | rabbit_tail | white_leotard |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:--------------------|:-----------|:--------------------|:-----------|:----------------|:-------|:---------------|:-----------------|:---------|:-------------|:--------|:-------------------|:-----|:-------------|:-----------|:-------------|:--------------------|:-------------|:-------------------|:------------|:---------|:-----------------|:---------------|:----------|:--------------------|:--------------|:-------------------|:--------------|:----------|:--------|:--------|:-------------------|:-----------|:--------|:-------------|:-------------|:----------|:--------------|:-----------------|:---------------|:--------|:--------------|:------------|:---------|:--------------|:----------|:---------|:---------------|:----------------|:-----------------|:---------|:--------------|:-----------|:--------------|:---------------|:-----------|:--------|:------|:--------|:-----------|:--------|:---------------|:----------------|:-------------|:--------------|:-------------------------|:-----------------|:--------------|:-----------|:--------|:------------|:-----------|:-------------|:---------------|:---------------------|:--------------------|:------------|:----------------|:------------|:--------|:----------|:-----------|:---------------|:----------------|:------------------|:------------|:-------------|:----------------------|:------------|:-----------------|:----------------|:---------|:------------|:-------|:----------------|:--------|:----------------|:----------|:--------------|:--------------|:-----------|:---------------|:-----------------|:--------------|:-------------|:--------------|:--------|:---------|:--------------|:----------|:------------|:----------|:------------------|:---------|:-------------|:--------|:-------|:--------|:-------------------|:--------------|:--------------------|:--------|:---------|:------|:----------|:------------------|:-------------------|:----------------|:--------------|:------------------|:--------------------|:---------------|:----------------|:--------------|:----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | | | X | | X | X | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | | | | X | | | | | | | X | | | X | | | | | X | | | | | | | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | | | X | | X | | | | | | | | | | | | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | | X | | | X | X | X | | | | | | | | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | X | X | | | X | | | | | | | | | | | X | | X | | | | | | | | | | | | | | X | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | X | | X | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | X | | X | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | X | X | | X | | | X | | | | | X | | X | | | X | | | | | | | | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | X | X | | X | | | X | | | | X | | | X | | | X | X | | X | | | | | | | | | | | | | | X | | | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | | X | | | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 19 |  |  |  |  |  | X | X | | | X | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 13 | 7 |  |  |  |  |  | X | X | | | X | | | X | | X | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
aditijha/chat_vicuna_10k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 47652483
num_examples: 10000
download_size: 22840614
dataset_size: 47652483
---
# Dataset Card for "chat_vicuna_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jeswinLLM/VERY_RANDOM_DATA | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': cancel_order
'1': change_order
'2': change_shipping_address
'3': check_cancellation_fee
'4': check_invoice
'5': check_payment_methods
'6': check_refund_policy
'7': complaint
'8': contact_customer_service
'9': contact_human_agent
'10': create_account
'11': delete_account
'12': delivery_options
'13': delivery_period
'14': edit_account
'15': get_invoice
'16': get_refund
'17': newsletter_subscription
'18': payment_issue
'19': place_order
'20': recover_password
'21': registration_problems
'22': review
'23': set_up_shipping_address
'24': switch_account
'25': track_order
'26': track_refund
- name: label_name
dtype: string
splits:
- name: train
num_bytes: 2103516
num_examples: 26872
download_size: 435285
dataset_size: 2103516
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/euphyllia_magenta_tenseioujototensaireijounomahoukakumei | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Euphyllia Magenta
This is the dataset of Euphyllia Magenta, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 635 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 635 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 635 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 635 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
cquaker/yi-bagel-clean | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: source
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 1842240223
num_examples: 798600
download_size: 923343801
dataset_size: 1842240223
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibivibiv/alpaca_tiny18 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 460188894
num_examples: 290900
download_size: 266086538
dataset_size: 460188894
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ybelkada/english_quotes_copy | ---
dataset_info:
features:
- name: quote
dtype: string
- name: author
dtype: string
- name: tags
sequence: string
splits:
- name: train
num_bytes: 598359
num_examples: 2508
download_size: 349107
dataset_size: 598359
---
# Dataset Card for "english_quotes_copy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/bangdreamitsmygo | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of BanG Dream! It's MyGO!!!!!
This is the image base of bangumi BanG Dream! It's MyGO!!!!!, we detected 23 characters, 3511 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 100 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 163 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 155 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 393 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 314 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 416 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 417 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 559 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 458 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 19 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 9 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 25 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 19 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 8 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 7 | [Download](14/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 15 | 26 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 11 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 8 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 41 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 10 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 8 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 8 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 337 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
yacahu/misako | ---
license: other
---
|
LangChainDatasets/state-of-the-union-completions | ---
dataset_info:
features:
- name: generations
list:
list:
- name: generation_info
struct:
- name: finish_reason
dtype: string
- name: logprobs
dtype: 'null'
- name: text
dtype: string
- name: ground_truth
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 65981
num_examples: 50
download_size: 50527
dataset_size: 65981
---
# Dataset Card for "state-of-the-union-completions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lilacai/lilac-GSM8K-socratic | ---
tags:
- Lilac
---
# lilac/GSM8K-socratic
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/gsm8k](https://huggingface.co/datasets/gsm8k)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-GSM8K-socratic
```
or from python with:
```py
import lilac as ll

ll.download("lilacai/lilac-GSM8K-socratic")
```
|
Francesco/liver-disease | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': diseases
'1': ballooning
'2': fibrosis
'3': inflammation
'4': steatosis
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: liver-disease
tags:
- rf100
---
# Dataset Card for liver-disease
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/liver-disease
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
liver-disease
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 640,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
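As a small illustration of the COCO box convention used in the `bbox` field above, here is a minimal sketch (the helper name `coco_to_corners` is ours, not part of the dataset) converting a `[x_min, y_min, width, height]` box to corner coordinates:

```python
def coco_to_corners(bbox):
    """Convert a COCO-format box [x_min, y_min, width, height]
    to corner coordinates [x_min, y_min, x_max, y_max]."""
    x_min, y_min, width, height = bbox
    return [x_min, y_min, x_min + width, y_min + height]

# First box from the data instance above; note that its `area`
# value (3796) equals width * height (73 * 52).
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # [302.0, 109.0, 375.0, 161.0]
```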
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/liver-disease
### Citation Information
```
@misc{ liver-disease,
title = { liver disease Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/liver-disease } },
url = { https://universe.roboflow.com/object-detection/liver-disease },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
sanagnos/processed_gpt_dataset_max | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 27715208700.0
num_examples: 2253269
download_size: 7902918885
dataset_size: 27715208700.0
---
# Dataset Card for "processed_gpt_dataset_max"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/PKDD_RoBERTa_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115608907.5
num_examples: 37500
- name: test
num_bytes: 38536305.0
num_examples: 12500
download_size: 211881539
dataset_size: 154145212.5
---
# Dataset Card for "PKDD_RoBERTa_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
euclaise/tex-stackexchange | ---
dataset_info:
features:
- name: parent_url
dtype: string
- name: parent_score
dtype: string
- name: parent_body
dtype: string
- name: parent_user
dtype: string
- name: parent_title
dtype: string
- name: accepted
dtype: bool
- name: body
dtype: string
- name: score
dtype: string
- name: user
dtype: string
- name: answer_id
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 505025688
num_examples: 190807
download_size: 221660047
dataset_size: 505025688
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-sa-4.0
---
# Dataset Card for "tex-stackexchange"
This is a dump of [the TeX StackExchange community](https://tex.stackexchange.com/), converted to markdown.
Data from [The StackExchange data dump](https://archive.org/details/stackexchange), 2023-09-12 release.
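As a minimal sketch of working with rows that follow this card's schema, the snippet below filters for accepted answers in plain Python. The field names (`parent_title`, `accepted`, `score`) come from the feature list above; the row values are made up for illustration, and note that `score` is stored as a string:

```python
# Toy rows mimicking this dataset's schema (values are hypothetical).
rows = [
    {"parent_title": "How do I center a figure?", "accepted": True, "score": "12"},
    {"parent_title": "What does \\relax do?", "accepted": False, "score": "3"},
]

# Keep only accepted answers; `score` is a string in the schema, so cast it.
accepted = [r for r in rows if r["accepted"]]
scores = [int(r["score"]) for r in accepted]
print(len(accepted), scores)  # 1 [12]
```

The same filter can be applied to the full dump with `datasets.Dataset.filter` once the dataset is loaded.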
Posts where the *questions* included images are excluded. Images in the answers are stripped out. |
epinnock/magicoder-evol-instruct-110k-with-embeddings | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: response
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 588197513
num_examples: 111183
download_size: 548070385
dataset_size: 588197513
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T16:42:34.900797](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged/blob/main/results_2023-10-28T16-42-34.900797.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0030411073825503355,\n\
\ \"em_stderr\": 0.0005638896908753095,\n \"f1\": 0.07054530201342267,\n\
\ \"f1_stderr\": 0.0015180367806482934,\n \"acc\": 0.42638763311757666,\n\
\ \"acc_stderr\": 0.00983841076151077\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0030411073825503355,\n \"em_stderr\": 0.0005638896908753095,\n\
\ \"f1\": 0.07054530201342267,\n \"f1_stderr\": 0.0015180367806482934\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08718726307808947,\n \
\ \"acc_stderr\": 0.007770691416783557\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237983\n\
\ }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T16_42_34.900797
path:
- '**/details_harness|drop|3_2023-10-28T16-42-34.900797.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T16-42-34.900797.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T16_42_34.900797
path:
- '**/details_harness|gsm8k|5_2023-10-28T16-42-34.900797.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T16-42-34.900797.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-16-44.707859.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-16-44.707859.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T17-16-44.707859.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T16_42_34.900797
path:
- '**/details_harness|winogrande|5_2023-10-28T16-42-34.900797.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T16-42-34.900797.parquet'
- config_name: results
data_files:
- split: 2023_10_03T17_16_44.707859
path:
- results_2023-10-03T17-16-44.707859.parquet
- split: 2023_10_28T16_42_34.900797
path:
- results_2023-10-28T16-42-34.900797.parquet
- split: latest
path:
- results_2023-10-28T16-42-34.900797.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged",
"harness_winogrande_5",
split="train")
```
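Since each run split is named after the run's timestamp, you may want to order runs chronologically yourself rather than rely on the "latest" split. A minimal sketch (the helper `parse_run_split` is hypothetical, not part of the `datasets` API), assuming the split names follow the `YYYY_MM_DDTHH_MM_SS.ffffff` pattern used here:

```python
from datetime import datetime

# Run splits in this dataset are named with the run timestamp, e.g.
# "2023_10_28T16_42_34.900797". This hypothetical helper parses such a
# name back into a datetime so runs can be ordered chronologically.
def parse_run_split(name: str) -> datetime:
    # ":" and "-" from the ISO timestamp are replaced by "_" in split names.
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

runs = ["2023_10_03T17_16_44.707859", "2023_10_28T16_42_34.900797"]
latest = max(runs, key=parse_run_split)
print(latest)  # -> 2023_10_28T16_42_34.900797
```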
## Latest results
These are the [latest results from run 2023-10-28T16:42:34.900797](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16_merged/blob/main/results_2023-10-28T16-42-34.900797.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753095,
"f1": 0.07054530201342267,
"f1_stderr": 0.0015180367806482934,
"acc": 0.42638763311757666,
"acc_stderr": 0.00983841076151077
},
"harness|drop|3": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753095,
"f1": 0.07054530201342267,
"f1_stderr": 0.0015180367806482934
},
"harness|gsm8k|5": {
"acc": 0.08718726307808947,
"acc_stderr": 0.007770691416783557
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237983
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
lucadiliello/news_as2 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 316302353
num_examples: 1840533
- name: dev
num_bytes: 8925506
num_examples: 51844
- name: test
num_bytes: 8824280
num_examples: 51472
download_size: 35957517
dataset_size: 334052139
---
# Dataset Card for "news_as2"
Answer Sentence Selection version of the NewsQA dataset. For more info, check out the original [repository](https://github.com/lucadiliello/answer-selection). |
Lollitor/PocketMarkedDataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: ID
dtype: string
- name: LABEL
dtype: float64
- name: INPUT
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5646386
num_examples: 7959
- name: validation
num_bytes: 622532
num_examples: 885
download_size: 3168627
dataset_size: 6268918
---
# Dataset Card for "PocketMarkedDataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_aiplanet__buddhi-128k-chat-7b | ---
pretty_name: Evaluation run of aiplanet/buddhi-128k-chat-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aiplanet/buddhi-128k-chat-7b](https://huggingface.co/aiplanet/buddhi-128k-chat-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aiplanet__buddhi-128k-chat-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T08:28:47.805852](https://huggingface.co/datasets/open-llm-leaderboard/details_aiplanet__buddhi-128k-chat-7b/blob/main/results_2024-04-05T08-28-47.805852.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6032425026032818,\n\
\ \"acc_stderr\": 0.0332038274111874,\n \"acc_norm\": 0.6082766963747008,\n\
\ \"acc_norm_stderr\": 0.033879617529146235,\n \"mc1\": 0.4883720930232558,\n\
\ \"mc1_stderr\": 0.017498767175740088,\n \"mc2\": 0.6571512658634991,\n\
\ \"mc2_stderr\": 0.01530967561857414\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.014521226405627082,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.01426412212493821\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6467835092611034,\n\
\ \"acc_stderr\": 0.004769924131304649,\n \"acc_norm\": 0.8399721171081458,\n\
\ \"acc_norm_stderr\": 0.0036588262081016084\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.039531733777491945,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.039531733777491945\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6870967741935484,\n \"acc_stderr\": 0.02637756702864586,\n \"\
acc_norm\": 0.6870967741935484,\n \"acc_norm_stderr\": 0.02637756702864586\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686858,\n \
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686858\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.01720857935778758,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.01720857935778758\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.032596251184168264,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.032596251184168264\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709581,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.0253052581318797,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.0253052581318797\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n\
\ \"acc_stderr\": 0.016062290671110476,\n \"acc_norm\": 0.36089385474860336,\n\
\ \"acc_norm_stderr\": 0.016062290671110476\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862744,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862744\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n\
\ \"acc_stderr\": 0.012643004623790205,\n \"acc_norm\": 0.42959582790091266,\n\
\ \"acc_norm_stderr\": 0.012643004623790205\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159696,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159696\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6094771241830066,\n \"acc_stderr\": 0.019737008998094597,\n \
\ \"acc_norm\": 0.6094771241830066,\n \"acc_norm_stderr\": 0.019737008998094597\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4883720930232558,\n\
\ \"mc1_stderr\": 0.017498767175740088,\n \"mc2\": 0.6571512658634991,\n\
\ \"mc2_stderr\": 0.01530967561857414\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091087\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38286580742987114,\n \
\ \"acc_stderr\": 0.013389223491820467\n }\n}\n```"
repo_url: https://huggingface.co/aiplanet/buddhi-128k-chat-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|arc:challenge|25_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|gsm8k|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hellaswag|10_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T08-28-47.805852.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T08-28-47.805852.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- '**/details_harness|winogrande|5_2024-04-05T08-28-47.805852.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T08-28-47.805852.parquet'
- config_name: results
data_files:
- split: 2024_04_05T08_28_47.805852
path:
- results_2024-04-05T08-28-47.805852.parquet
- split: latest
path:
- results_2024-04-05T08-28-47.805852.parquet
---
# Dataset Card for Evaluation run of aiplanet/buddhi-128k-chat-7b
Dataset automatically created during the evaluation run of model [aiplanet/buddhi-128k-chat-7b](https://huggingface.co/aiplanet/buddhi-128k-chat-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aiplanet__buddhi-128k-chat-7b",
"harness_winogrande_5",
	split="latest")
```
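Because split names encode the run timestamp (e.g. `2024_04_05T08_28_47.805852`), the newest run can also be selected programmatically rather than through the `latest` alias. A minimal sketch (the helper name `latest_split` is illustrative, not part of the dataset API):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name.

    Timestamped splits look like "2024_04_05T08_28_47.805852";
    the literal "latest" alias is skipped since it merely mirrors
    the newest run.
    """
    stamps = [s for s in split_names if s != "latest"]
    return max(stamps, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))
```

This is mainly useful when a repository accumulates several evaluation runs and you want to compare them or pick the newest one explicitly.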
## Latest results
These are the [latest results from run 2024-04-05T08:28:47.805852](https://huggingface.co/datasets/open-llm-leaderboard/details_aiplanet__buddhi-128k-chat-7b/blob/main/results_2024-04-05T08-28-47.805852.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6032425026032818,
"acc_stderr": 0.0332038274111874,
"acc_norm": 0.6082766963747008,
"acc_norm_stderr": 0.033879617529146235,
"mc1": 0.4883720930232558,
"mc1_stderr": 0.017498767175740088,
"mc2": 0.6571512658634991,
"mc2_stderr": 0.01530967561857414
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.014521226405627082,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.01426412212493821
},
"harness|hellaswag|10": {
"acc": 0.6467835092611034,
"acc_stderr": 0.004769924131304649,
"acc_norm": 0.8399721171081458,
"acc_norm_stderr": 0.0036588262081016084
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.039531733777491945,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.039531733777491945
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.02515826601686858,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.02515826601686858
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.01720857935778758,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.01720857935778758
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.032596251184168264,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.032596251184168264
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709581,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.0253052581318797,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.0253052581318797
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36089385474860336,
"acc_stderr": 0.016062290671110476,
"acc_norm": 0.36089385474860336,
"acc_norm_stderr": 0.016062290671110476
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862744,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862744
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.012643004623790205,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.012643004623790205
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159696,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159696
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6094771241830066,
"acc_stderr": 0.019737008998094597,
"acc_norm": 0.6094771241830066,
"acc_norm_stderr": 0.019737008998094597
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4883720930232558,
"mc1_stderr": 0.017498767175740088,
"mc2": 0.6571512658634991,
"mc2_stderr": 0.01530967561857414
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091087
},
"harness|gsm8k|5": {
"acc": 0.38286580742987114,
"acc_stderr": 0.013389223491820467
}
}
```
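The results file is a plain nested JSON object, so individual metrics can be extracted with ordinary Python. A minimal sketch (the dict below is a hypothetical excerpt of the full results shown above):

```python
import json

# Hypothetical excerpt mirroring the structure of the results JSON above.
results_json = """
{
  "all": {"acc": 0.6032425026032818, "acc_norm": 0.6082766963747008},
  "harness|arc:challenge|25": {"acc": 0.5554607508532423, "acc_norm": 0.6083617747440273},
  "harness|hellaswag|10": {"acc": 0.6467835092611034, "acc_norm": 0.8399721171081458}
}
"""
results = json.loads(results_json)

# Collect normalized accuracy for every per-task entry (skip the "all" aggregate).
per_task = {name: scores["acc_norm"]
            for name, scores in results.items()
            if name != "all"}
best_task = max(per_task, key=per_task.get)
print(best_task, round(per_task[best_task], 4))
```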
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lilacai/lilac-open-asssistant-conversations | ---
tags:
- Lilac
---
# lilac/open-asssistant-conversations
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/OpenAssistant/oasst1](https://huggingface.co/datasets/OpenAssistant/oasst1)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-open-asssistant-conversations
```
or from Python with:
```py
import lilac as ll

ll.download("lilacai/lilac-open-asssistant-conversations")
```
|
heliosprime/twitter_dataset_1713173379 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9040
num_examples: 24
download_size: 12325
dataset_size: 9040
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713173379"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VityaVitalich/WordNet-TaxoLLaMA | ---
dataset_info:
features:
- name: hyponym
dtype: string
- name: hypernym
dtype: string
- name: definition
dtype: string
splits:
- name: train
num_bytes: 4974231
num_examples: 44772
- name: test
num_bytes: 5422
num_examples: 49
download_size: 3485218
dataset_size: 4979653
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset card for WordNet-TaxoLLaMA
[TaxoLLaMA](https://huggingface.co/VityaVitalich/TaxoLLaMA) is a model capable of solving lexical semantics tasks with SoTA metrics.
The model was fine-tuned on the instructive dataset WordNet-TaxoLLaMA. It consists of hypernym-hyponym pairs sampled from WordNet 3.0. It also contains definitions, which were used during training to help the model disambiguate senses.
## Input Format
The TaxoLLaMA model was trained to use the following format :
```
<s>[INST] <<SYS>> You are a helpfull assistant. List all the possible words divided with a coma. Your answer should not include anything except the words divided by a coma<</SYS>>
hyponym: tiger (large feline of forests in most of Asia having a tawny coat with black stripes)| hypernyms: [/INST]
```
We recommend that you follow this format; however, you are free to change it to suit your task!
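As an illustrative sketch (the helper below is hypothetical, not part of the released code), the prompt can be assembled programmatically from a hyponym/definition pair:

```python
# Note: the "helpfull"/"coma" spellings are kept verbatim to match the
# exact system prompt the model was trained with.
SYSTEM = ("You are a helpfull assistant. List all the possible words divided with a coma. "
          "Your answer should not include anything except the words divided by a coma")

def build_prompt(hyponym: str, definition: str) -> str:
    # Mirrors the Llama-2 chat-style instruction format shown above.
    return (f"<s>[INST] <<SYS>> {SYSTEM}<</SYS>>\n"
            f"hyponym: {hyponym} ({definition})| hypernyms: [/INST]")

prompt = build_prompt(
    "tiger",
    "large feline of forests in most of Asia having a tawny coat with black stripes",
)
print(prompt)
```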
## Citation
If you find TaxoLLaMA or WordNet-TaxoLLaMA useful in your work, please cite it with:
```
@misc{moskvoretskii2024taxollama,
title={TaxoLLaMA: WordNet-based Model for Solving Multiple Lexical Sematic Tasks},
author={Viktor Moskvoretskii and Ekaterina Neminova and Alina Lobanova and Alexander Panchenko and Irina Nikishina},
year={2024},
eprint={2403.09207},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
bigscience-data/roots_vi_ted_talks_iwslt | ---
language: vi
license: cc-by-nc-nd-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_vi_ted_talks_iwslt
# WIT Ted Talks
- Dataset uid: `ted_talks_iwslt`
### Description
The Web Inventory Talk (WIT3) is a collection of the original TED talks and their translated versions. The translations are available in more than 109 languages, though the distribution is not uniform.
### Homepage
https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md
### Licensing
- open license
- cc-by-nc-4.0: Creative Commons Attribution Non Commercial 4.0 International
TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license. WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED talks!
### Speaker Locations
- Southern Europe
- Italy
### Sizes
- 0.0305 % of total
- 0.0736 % of ar
- 0.2002 % of pt
- 0.0128 % of zh
- 0.2236 % of vi
- 0.0330 % of fr
- 0.0545 % of es
- 0.0122 % of en
- 0.3704 % of id
- 0.0373 % of indic-hi
- 0.0330 % of indic-ta
- 0.1393 % of indic-mr
- 0.0305 % of ca
- 0.1179 % of indic-ur
- 0.0147 % of indic-bn
- 0.0240 % of indic-ml
- 0.0244 % of indic-te
- 0.0503 % of indic-gu
- 0.0211 % of indic-kn
- 0.0274 % of eu
- 0.0023 % of indic-as
- 0.0001 % of indic-pa
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ca
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ur
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-as
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-pa
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
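The filter names above are descriptive. As an unofficial sketch (not the actual BigScience implementation), a byte-threshold filter such as `filter_small_docs_bytes_300` would keep only documents whose UTF-8 size meets the threshold, and `filter_remove_empty_docs` would drop empty documents:

```python
def filter_small_docs(docs, min_bytes=300):
    """Keep documents whose UTF-8 encoding is at least `min_bytes` long.

    Illustrative sketch of a filter like `filter_small_docs_bytes_300`;
    not the actual BigScience implementation.
    """
    return [d for d in docs if len(d.encode("utf-8")) >= min_bytes]

def filter_remove_empty_docs(docs):
    """Drop documents that are empty or whitespace-only (sketch)."""
    return [d for d in docs if d.strip()]

docs = ["", "   ", "short doc", "x" * 300]
kept = filter_small_docs(filter_remove_empty_docs(docs))
```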
|
CyberHarem/sister_gigant_mahoushoujoniakogarete | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Sister Gigant/シスタギガント (Mahou Shoujo ni Akogarete)
This is the dataset of Sister Gigant/シスタギガント (Mahou Shoujo ni Akogarete), containing 115 images and their tags.
The core tags of this character are `long_hair, pink_hair, purple_hair, breasts, large_breasts, aqua_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 115 | 81.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sister_gigant_mahoushoujoniakogarete/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 115 | 81.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sister_gigant_mahoushoujoniakogarete/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 212 | 131.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sister_gigant_mahoushoujoniakogarete/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sister_gigant_mahoushoujoniakogarete',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, habit, nun, solo, underboob, green_eyes, looking_at_viewer, long_sleeves, standing, dress |
| 1 | 20 |  |  |  |  |  | 1girl, day, habit, nun, outdoors, solo, blue_sky, cloud, torn_clothes, underboob, upper_body, closed_mouth, veil, blue_eyes, blush |
| 2 | 9 |  |  |  |  |  | 1girl, blush, cloud, nun, solo, habit, open_mouth, blue_sky, day, outdoors, facial_mark, looking_at_viewer, parody |
| 3 | 5 |  |  |  |  |  | facial_mark, 1girl, solo, torn_clothes, closed_mouth, facial_tattoo, hands_up, looking_at_viewer, upper_body, fingernails, frown, wrist_cuffs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | habit | nun | solo | underboob | green_eyes | looking_at_viewer | long_sleeves | standing | dress | day | outdoors | blue_sky | cloud | torn_clothes | upper_body | closed_mouth | veil | blue_eyes | blush | open_mouth | facial_mark | parody | facial_tattoo | hands_up | fingernails | frown | wrist_cuffs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:------|:-------|:------------|:-------------|:--------------------|:---------------|:-----------|:--------|:------|:-----------|:-----------|:--------|:---------------|:-------------|:---------------|:-------|:------------|:--------|:-------------|:--------------|:---------|:----------------|:-----------|:--------------|:--------|:--------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 20 |  |  |  |  |  | X | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | | | X | | | | X | X | X | X | | | | | | X | X | X | X | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | | X | | | | | | | | X | X | X | | | | | X | | X | X | X | X | X |
|
CyberHarem/mizuki_seira_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mizuki_seira/水木聖來 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mizuki_seira/水木聖來 (THE iDOLM@STER: Cinderella Girls), containing 196 images and their tags.
The core tags of this character are `brown_hair, brown_eyes, short_hair, breasts, earrings, medium_breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 196 | 226.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_seira_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 196 | 124.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_seira_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 460 | 262.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_seira_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 196 | 196.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_seira_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 460 | 389.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mizuki_seira_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mizuki_seira_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, jewelry, looking_at_viewer, solo, belt, cleavage, mini_hat, short_shorts, smile, black_gloves, crop_top, midriff, thighhighs, lace_trim, navel, black_shorts, cowboy_shot, idol, open_jacket, simple_background, suspender_shorts, white_background, white_jacket, black_headwear, closed_mouth, detached_collar, hair_between_eyes, long_sleeves, standing, stomach, thigh_boots |
| 1 | 14 |  |  |  |  |  | 1girl, solo, midriff, smile, navel, open_mouth, thighhighs, belt, cleavage, skirt, hair_ornament, fingerless_gloves, looking_at_viewer, one_eye_closed, bare_shoulders, blush, boots, bracelet, choker, microphone |
| 2 | 7 |  |  |  |  |  | 1girl, jewelry, open_mouth, smile, solo, blush, dog, one_eye_closed, pants, ;d, looking_at_viewer |
| 3 | 12 |  |  |  |  |  | 1girl, solo, collarbone, jewelry, looking_at_viewer, simple_background, white_background, blush, smile, upper_body, black_shirt, long_sleeves, closed_mouth, open_mouth |
| 4 | 6 |  |  |  |  |  | 1girl, solo, looking_at_viewer, navel, smile, blue_bikini, cleavage, jewelry |
| 5 | 9 |  |  |  |  |  | 1girl, detached_collar, rabbit_ears, solo, wrist_cuffs, bowtie, fake_animal_ears, playboy_bunny, cleavage, looking_at_viewer, bare_shoulders, blush, pantyhose, simple_background, smile, large_breasts, white_background |
| 6 | 5 |  |  |  |  |  | 1girl, bowtie, cleavage, fake_animal_ears, fishnet_thighhighs, garter_straps, looking_at_viewer, midriff, navel, rabbit_ears, solo, bare_shoulders, miniskirt, simple_background, white_background, wrist_cuffs, black_skirt, blush, closed_mouth, crop_top, detached_collar, full_body, hairband, sleeveless, smile, vest, black_footwear, coattails, high_heels, large_breasts, microskirt, rabbit_tail, red_bow, stomach |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jewelry | looking_at_viewer | solo | belt | cleavage | mini_hat | short_shorts | smile | black_gloves | crop_top | midriff | thighhighs | lace_trim | navel | black_shorts | cowboy_shot | idol | open_jacket | simple_background | suspender_shorts | white_background | white_jacket | black_headwear | closed_mouth | detached_collar | hair_between_eyes | long_sleeves | standing | stomach | thigh_boots | open_mouth | skirt | hair_ornament | fingerless_gloves | one_eye_closed | bare_shoulders | blush | boots | bracelet | choker | microphone | dog | pants | ;d | collarbone | upper_body | black_shirt | blue_bikini | rabbit_ears | wrist_cuffs | bowtie | fake_animal_ears | playboy_bunny | pantyhose | large_breasts | fishnet_thighhighs | garter_straps | miniskirt | black_skirt | full_body | hairband | sleeveless | vest | black_footwear | coattails | high_heels | microskirt | rabbit_tail | red_bow |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:--------------------|:-------|:-------|:-----------|:-----------|:---------------|:--------|:---------------|:-----------|:----------|:-------------|:------------|:--------|:---------------|:--------------|:-------|:--------------|:--------------------|:-------------------|:-------------------|:---------------|:-----------------|:---------------|:------------------|:--------------------|:---------------|:-----------|:----------|:--------------|:-------------|:--------|:----------------|:--------------------|:-----------------|:-----------------|:--------|:--------|:-----------|:---------|:-------------|:------|:--------|:-----|:-------------|:-------------|:--------------|:--------------|:--------------|:--------------|:---------|:-------------------|:----------------|:------------|:----------------|:---------------------|:----------------|:------------|:--------------|:------------|:-----------|:-------------|:-------|:-----------------|:------------|:-------------|:-------------|:--------------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | X | X | X | | | X | | | X | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | X | X | X | | | | | X | | | | | | | | | | | X | | X | | | X | | | X | | | | X | | | | | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | X | | X | | | X | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | X | | X | | | X | | X | X | | | X | | | | | X | | X | | | X | X | | | | X | | | | | | | X | X | | | | | | | | | | | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
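
The "Table Version" above is just a one-hot expansion of the compact tag lists: each tag becomes a column, with `X` marking the samples that carry it. Given rows in the compact form, the same matrix can be rebuilt with pandas — a minimal sketch, with hypothetical sample rows (tags abbreviated for brevity):

```python
import pandas as pd

# Hypothetical rows mirroring the compact table: each sample carries a
# comma-separated tag string.
rows = pd.DataFrame({
    "sample": [0, 1],
    "tags": ["1girl, jewelry, solo", "1girl, solo, midriff"],
})

# One-hot encode the tag lists, reproducing the X-matrix layout of the
# "Table Version" (columns come out sorted alphabetically by tag).
onehot = rows["tags"].str.get_dummies(sep=", ")
matrix = pd.concat([rows[["sample"]], onehot], axis=1)
print(matrix)
```

`Series.str.get_dummies` does the splitting and indicator encoding in one step, which keeps the reconstruction faithful to the table without manual string handling.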
|
open-llm-leaderboard/details_bartowski__internlm2-math-20b-llama | ---
---
pretty_name: Evaluation run of bartowski/internlm2-math-20b-llama
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bartowski/internlm2-math-20b-llama](https://huggingface.co/bartowski/internlm2-math-20b-llama)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bartowski__internlm2-math-20b-llama\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T05:51:31.434464](https://huggingface.co/datasets/open-llm-leaderboard/details_bartowski__internlm2-math-20b-llama/blob/main/results_2024-01-25T05-51-31.434464.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6401833310404773,\n\
\ \"acc_stderr\": 0.03215591321198736,\n \"acc_norm\": 0.6526470071520217,\n\
\ \"acc_norm_stderr\": 0.03295626890086434,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5289709889013895,\n\
\ \"mc2_stderr\": 0.015151269954401329\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5452218430034129,\n \"acc_stderr\": 0.014551507060836355,\n\
\ \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809176\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6307508464449313,\n\
\ \"acc_stderr\": 0.004816152074023084,\n \"acc_norm\": 0.8163712407886875,\n\
\ \"acc_norm_stderr\": 0.003863898546941598\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810534,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810534\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106134,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8282828282828283,\n \"acc_stderr\": 0.026869716187429903,\n \"\
acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.026869716187429903\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4074074074074074,\n \"acc_stderr\": 0.029958249250082114,\n \
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.029958249250082114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02934457250063434,\n \
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02934457250063434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650153,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650153\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240658,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240658\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153183,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5407821229050279,\n\
\ \"acc_stderr\": 0.016666783616525776,\n \"acc_norm\": 0.5407821229050279,\n\
\ \"acc_norm_stderr\": 0.016666783616525776\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579922,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.012719949543032218,\n\
\ \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.012719949543032218\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"\
acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533197,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533197\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5289709889013895,\n\
\ \"mc2_stderr\": 0.015151269954401329\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275623\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \
\ \"acc_stderr\": 0.003970449129848636\n }\n}\n```"
repo_url: https://huggingface.co/bartowski/internlm2-math-20b-llama
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|arc:challenge|25_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|gsm8k|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hellaswag|10_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-51-31.434464.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T05-51-31.434464.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- '**/details_harness|winogrande|5_2024-01-25T05-51-31.434464.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T05-51-31.434464.parquet'
- config_name: results
data_files:
- split: 2024_01_25T05_51_31.434464
path:
- results_2024-01-25T05-51-31.434464.parquet
- split: latest
path:
- results_2024-01-25T05-51-31.434464.parquet
---
# Dataset Card for Evaluation run of bartowski/internlm2-math-20b-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bartowski/internlm2-math-20b-llama](https://huggingface.co/bartowski/internlm2-math-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bartowski__internlm2-math-20b-llama",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-25T05:51:31.434464](https://huggingface.co/datasets/open-llm-leaderboard/details_bartowski__internlm2-math-20b-llama/blob/main/results_2024-01-25T05-51-31.434464.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in its own configuration, in both the timestamped and the "latest" splits):
```python
{
"all": {
"acc": 0.6401833310404773,
"acc_stderr": 0.03215591321198736,
"acc_norm": 0.6526470071520217,
"acc_norm_stderr": 0.03295626890086434,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5289709889013895,
"mc2_stderr": 0.015151269954401329
},
"harness|arc:challenge|25": {
"acc": 0.5452218430034129,
"acc_stderr": 0.014551507060836355,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809176
},
"harness|hellaswag|10": {
"acc": 0.6307508464449313,
"acc_stderr": 0.004816152074023084,
"acc_norm": 0.8163712407886875,
"acc_norm_stderr": 0.003863898546941598
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810534,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810534
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106134,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.026869716187429903,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.026869716187429903
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.029958249250082114,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.029958249250082114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02934457250063434,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02934457250063434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650153,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650153
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.033674621388960775,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.033674621388960775
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240658,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240658
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822915,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822915
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128136,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153183,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5407821229050279,
"acc_stderr": 0.016666783616525776,
"acc_norm": 0.5407821229050279,
"acc_norm_stderr": 0.016666783616525776
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579922,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032218,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032218
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696647,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533197,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533197
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5289709889013895,
"mc2_stderr": 0.015151269954401329
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275623
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848636
}
}
```
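As a hedged sketch of how these per-task entries can be post-processed (the dictionary literal below copies a small subset of the results above, and grouping by the `hendrycksTest-` key prefix is an assumption about the naming scheme, not an official aggregation API):

```python
from statistics import mean

# Subset of the per-task results shown above; keys follow the
# "harness|<benchmark>|<n_shots>" naming used in this card.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5998293515358362},
    "harness|hellaswag|10": {"acc_norm": 0.8163712407886875},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6222222222222222},
}

# Average acc_norm over the MMLU (hendrycksTest) tasks only.
mmlu_scores = [
    v["acc_norm"] for k, v in results.items() if "hendrycksTest-" in k
]
print(round(mean(mmlu_scores), 4))
```

The full "results" configuration already contains the leaderboard's own aggregates (the "all" block above), so this kind of manual averaging is only needed for custom task groupings.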
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
edbeeching/prj_gia_dataset_atari_2B_atari_frostbite_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the atari_frostbite environment, sampled from the policy atari_2B_atari_frostbite_1111.
This dataset was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
|
Harshvardhan27/Wikicorpus_Fine_Tuned_Mistral_1000 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: input_length
dtype: int64
- name: input_prompt
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 3365988
num_examples: 1000
download_size: 2093406
dataset_size: 3365988
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sugam11/covid-qa | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: context
dtype: string
- name: document_id
dtype: int64
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 48676376
num_examples: 1417
- name: test
num_bytes: 11614522
num_examples: 375
- name: validation
num_bytes: 4317894
num_examples: 203
download_size: 2252430
dataset_size: 64608792
---
# Dataset Card for "covid-qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Krooz/Campus_Recruitment_Text | ---
license: cc0-1.0
task_categories:
- text-classification
language:
- en
tags:
- education
- university
- placement
pretty_name: Students Records for Placements
size_categories:
- 1K<n<10K
---
## Dataset Description
This dataset consists of placement data for students at an XYZ campus. Based on each student's performance report, we classify their placement status. The dataset is derived from a [csv data](https://huggingface.co/datasets/Krooz/Campus_Recruitment_CSV).
The Mistral7B model is used with a data-to-text methodology to convert each row of the CSV data into a textual format for LLMs; the conversion script is in this [notebook](https://github.com/Kirushikesh/Campus_Recruitment_Prediction_LLM/blob/main/data_preprocess.ipynb).
The Prompt field is the prompt used with the Mistral7B LLM, and the response field is the corresponding generated text, which can be used for fine-tuning LLMs.
Each student's report consists of the following information:
* CGPA - The student's grade at their university
* Internships - The number of internships completed by the student before final placement
* Projects - The number of projects done by the student
* Workshops/Certifications - The number of workshops attended and certifications held by the student
* AptitudeTestScore - The aptitude score the student attained on the exam
* SoftSkillsRating - The soft-skills rating attained by the student
* ExtracurricularActivities - Whether the student participated in extracurricular activities
* PlacementTraining - Whether the student received placement training
* SSC_Marks - The senior secondary school marks scored by the student
* HSC_Marks - The higher secondary school marks scored by the student
The label is PlacementStatus: whether the student is placed or not.
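The data-to-text step over the fields above can be sketched as a simple formatting function. This is an illustrative stand-in, not the actual conversion script: the real dataset was generated with Mistral7B, so its phrasing will differ, and the field names here simply mirror the list above.

```python
def record_to_text(record: dict) -> str:
    """Render one student record (a CSV row as a dict) into a short
    textual report, roughly mirroring a data-to-text conversion."""
    return (
        f"The student has a CGPA of {record['CGPA']}, completed "
        f"{record['Internships']} internship(s) and {record['Projects']} project(s), "
        f"and holds {record['Workshops/Certifications']} workshop/certification credit(s). "
        f"They scored {record['AptitudeTestScore']} on the aptitude test with a "
        f"soft-skills rating of {record['SoftSkillsRating']}. "
        f"Extracurricular activities: {record['ExtracurricularActivities']}; "
        f"placement training: {record['PlacementTraining']}. "
        f"SSC marks: {record['SSC_Marks']}, HSC marks: {record['HSC_Marks']}."
    )

row = {
    "CGPA": 8.2, "Internships": 1, "Projects": 2,
    "Workshops/Certifications": 3, "AptitudeTestScore": 85,
    "SoftSkillsRating": 4.5, "ExtracurricularActivities": "Yes",
    "PlacementTraining": "Yes", "SSC_Marks": 78, "HSC_Marks": 82,
}
print(record_to_text(row))
```

A template like this is deterministic and lossless; an LLM-based conversion reads more naturally but, as noted below, can drop information.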
## UseCases
This is a large dataset with 10k data items that can be used either for full fine-tuning or parameter-efficient fine-tuning as a text classification problem.
## Note
The data-to-text conversion is not perfectly accurate; there is a chance of data loss (information present in the data being lost during text generation). This can be improved by using better prompts or LLMs.
PS: Do give a like if you found the dataset useful :) |
TuringsSolutions/Aphasia500 | ---
license: mit
---
|
open-llm-leaderboard/details_S-miguel__The-Trinity-Coder-7B | ---
pretty_name: Evaluation run of S-miguel/The-Trinity-Coder-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [S-miguel/The-Trinity-Coder-7B](https://huggingface.co/S-miguel/The-Trinity-Coder-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_S-miguel__The-Trinity-Coder-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-23T10:17:41.109127](https://huggingface.co/datasets/open-llm-leaderboard/details_S-miguel__The-Trinity-Coder-7B/blob/main/results_2024-03-23T10-17-41.109127.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6534282218245622,\n\
\ \"acc_stderr\": 0.032087196236594055,\n \"acc_norm\": 0.6533558565004056,\n\
\ \"acc_norm_stderr\": 0.03275467258129072,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.6125265483406871,\n\
\ \"mc2_stderr\": 0.01534471270068822\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620196,\n\
\ \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276513\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6715793666600279,\n\
\ \"acc_stderr\": 0.004686789042445371,\n \"acc_norm\": 0.8616809400517825,\n\
\ \"acc_norm_stderr\": 0.003445289925011736\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"\
acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.0291265228345868,\n \"acc_norm\"\
: 0.7878787878787878,\n \"acc_norm_stderr\": 0.0291265228345868\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.01626567563201032,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.01626567563201032\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250437,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694486,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694486\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739734,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739734\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092382,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993455,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993455\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n\
\ \"acc_stderr\": 0.01592556406020815,\n \"acc_norm\": 0.3474860335195531,\n\
\ \"acc_norm_stderr\": 0.01592556406020815\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.6125265483406871,\n\
\ \"mc2_stderr\": 0.01534471270068822\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267209\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7338893100833965,\n \
\ \"acc_stderr\": 0.012172750939040324\n }\n}\n```"
repo_url: https://huggingface.co/S-miguel/The-Trinity-Coder-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|arc:challenge|25_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|gsm8k|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hellaswag|10_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T10-17-41.109127.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T10-17-41.109127.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- '**/details_harness|winogrande|5_2024-03-23T10-17-41.109127.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-23T10-17-41.109127.parquet'
- config_name: results
data_files:
- split: 2024_03_23T10_17_41.109127
path:
- results_2024-03-23T10-17-41.109127.parquet
- split: latest
path:
- results_2024-03-23T10-17-41.109127.parquet
---
# Dataset Card for Evaluation run of S-miguel/The-Trinity-Coder-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [S-miguel/The-Trinity-Coder-7B](https://huggingface.co/S-miguel/The-Trinity-Coder-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_S-miguel__The-Trinity-Coder-7B",
"harness_winogrande_5",
	split="latest")
```
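Every per-task configuration name follows the same pattern visible in the YAML above (`harness_<suite>_<task>_<n_shot>`, with the trailing number being the few-shot count). As an illustrative sketch (this helper is hypothetical, not part of the `datasets` API), the name for any task can be built like this:

```python
from typing import Optional

def config_name(suite: str, task: Optional[str], n_shot: int) -> str:
    """Build a configuration name as used by this dataset, e.g.
    'harness_hendrycksTest_abstract_algebra_5' or 'harness_winogrande_5'."""
    parts = ["harness", suite]
    if task:
        parts.append(task)
    parts.append(str(n_shot))
    return "_".join(parts)

# Names matching the configurations listed in the YAML header:
print(config_name("hendrycksTest", "abstract_algebra", 5))  # harness_hendrycksTest_abstract_algebra_5
print(config_name("winogrande", None, 5))                   # harness_winogrande_5
```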
## Latest results
These are the [latest results from run 2024-03-23T10:17:41.109127](https://huggingface.co/datasets/open-llm-leaderboard/details_S-miguel__The-Trinity-Coder-7B/blob/main/results_2024-03-23T10-17-41.109127.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6534282218245622,
"acc_stderr": 0.032087196236594055,
"acc_norm": 0.6533558565004056,
"acc_norm_stderr": 0.03275467258129072,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.6125265483406871,
"mc2_stderr": 0.01534471270068822
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620196,
"acc_norm": 0.6936860068259386,
"acc_norm_stderr": 0.013470584417276513
},
"harness|hellaswag|10": {
"acc": 0.6715793666600279,
"acc_stderr": 0.004686789042445371,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.003445289925011736
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.0291265228345868,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.0291265228345868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.01626567563201032,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.01626567563201032
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250437,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.02531049537694486,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.02531049537694486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.035865947385739734,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.035865947385739734
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092382,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993455,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993455
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.01592556406020815,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.01592556406020815
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.6125265483406871,
"mc2_stderr": 0.01534471270068822
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267209
},
"harness|gsm8k|5": {
"acc": 0.7338893100833965,
"acc_stderr": 0.012172750939040324
}
}
```
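The per-task scores above can be post-processed directly. As a minimal sketch (assuming the JSON block has been loaded into a Python dict named `results`; the small inline dict here is just a stand-in for the full file), this averages the per-subject MMLU (`hendrycksTest`) accuracies:

```python
# Stand-in for the full results dict, e.g. loaded via:
# import json
# results = json.load(open("results_2024-03-23T10-17-41.109127.json"))
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6666666666666666},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
    "harness|arc:challenge|25": {"acc": 0.643344709897611},
}

# Keep only the MMLU subjects, identified by their key prefix.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(f"MMLU subjects: {len(mmlu)}, mean acc: {mmlu_avg:.4f}")
```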
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ccdv/arxiv-classification | ---
language: en
task_categories:
- text-classification
tags:
- long context
task_ids:
- multi-class-classification
- topic-classification
size_categories: 10K<n<100K
---
**Arxiv Classification: a classification of Arxiv Papers (11 classes).**
This dataset is intended for long-context classification (all documents have more than 4k tokens). \
Copied from "Long Document Classification From Local Word Glimpses via Recurrent Attention Learning"
```
@ARTICLE{8675939,
author={He, Jun and Wang, Liqun and Liu, Liu and Feng, Jiao and Wu, Hao},
journal={IEEE Access},
title={Long Document Classification From Local Word Glimpses via Recurrent Attention Learning},
year={2019},
volume={7},
number={},
pages={40707-40718},
doi={10.1109/ACCESS.2019.2907992}
}
```
* See: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8675939
* See: https://github.com/LiqunW/Long-document-dataset
It contains 33k arXiv papers in 11 slightly unbalanced classes, divided into 3 splits: train (28k), val (2.5k), and test (2.5k).
2 configs:
* default
* no_ref: removes references to the class inside the document (e.g. [cs.LG] -> [])
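The `no_ref` cleaning step can be approximated with a small regex, shown here as a sketch (the dataset's actual preprocessing may differ in its details):

```python
import re

def strip_class_refs(text: str) -> str:
    """Replace arXiv class references such as [cs.LG] with [] in a document."""
    return re.sub(r"\[[a-z-]+\.[A-Z]{2}\]", "[]", text)

print(strip_class_refs("We evaluate transformers [cs.LG] on long documents [cs.CL]."))
# -> We evaluate transformers [] on long documents [].
```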
Compatible with [run_glue.py](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) script:
```
export MODEL_NAME=roberta-base
export MAX_SEQ_LENGTH=512
python run_glue.py \
--model_name_or_path $MODEL_NAME \
--dataset_name ccdv/arxiv-classification \
--do_train \
--do_eval \
--max_seq_length $MAX_SEQ_LENGTH \
--per_device_train_batch_size 8 \
--gradient_accumulation_steps 4 \
--learning_rate 2e-5 \
--num_train_epochs 1 \
--max_eval_samples 500 \
--output_dir tmp/arxiv
``` |
Imadken/platypus_masked | ---
license: apache-2.0
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: data_source
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 151193034.49602824
num_examples: 22433
- name: test
num_bytes: 16802221.503971756
num_examples: 2493
download_size: 43272479
dataset_size: 167995256.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ChristianMD/20240323S210 | ---
license: mit
---
|
sehyun66/STOCKPRICE | ---
license: mit
dataset_info:
config_name: NASDAQ_3y
features:
- name: Date
dtype: string
- name: Open
dtype: float64
- name: High
dtype: float64
- name: Low
dtype: float64
- name: Close
dtype: float64
- name: Adj Close
dtype: float64
- name: Volume
dtype: int64
- name: Ticker
dtype: string
splits:
- name: train
num_bytes: 112710930
num_examples: 1612680
download_size: 64458393
dataset_size: 112710930
configs:
- config_name: NASDAQ_3y
data_files:
- split: train
path: NASDAQ_3y/train-*
---
|
cfilt/PUB | ---
license: mit
task_categories:
- question-answering
- zero-shot-classification
- text-classification
- conversational
- text-generation
- text2text-generation
language:
- en
pretty_name: Pragmatics Understanding Benchmark (PUB)
size_categories:
- 10K<n<100K
--- |
rkstgr/mtg-jamendo | ---
license:
- apache-2.0
size_categories:
- 10K<n<100K
source_datasets:
- original
pretty_name: MTG Jamendo
---
# Dataset Card for MTG Jamendo Dataset
## Dataset Description
- **Repository:** [MTG Jamendo dataset repository](https://github.com/MTG/mtg-jamendo-dataset)
### Dataset Summary
The MTG-Jamendo Dataset is a new open dataset for music auto-tagging. It is built using music available at Jamendo under Creative Commons licenses and tags provided by content uploaders. The dataset contains over 55,000 full audio tracks with 195 tags from genre, instrument, and mood/theme categories. We provide elaborated data splits for researchers and report the performance of a simple baseline approach on five different sets of tags: genre, instrument, mood/theme, top-50, and overall.
## Dataset structure
### Data Fields
- `id`: an integer containing the id of the track
- `artist_id`: an integer containing the id of the artist
- `album_id`: an integer containing the id of the album
- `duration_in_sec`: duration of the track as a float
- `genres`: list of strings, describing genres the track is assigned to
- `instruments`: list of strings for the main instruments of the track
- `moods`: list of strings, describing the moods the track is assigned to
- `audio`: audio of the track
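For illustration, a single record following the fields above might look like this (all values below are made up, not taken from the actual dataset):

```python
# Hypothetical example record mirroring the documented fields
example = {
    "id": 1376,
    "artist_id": 490,
    "album_id": 161,
    "duration_in_sec": 183.4,
    "genres": ["electronic", "ambient"],
    "instruments": ["synthesizer"],
    "moods": ["calm"],
    # the datasets library typically decodes audio into path/array/sampling_rate
    "audio": {"path": "1376.mp3", "array": None, "sampling_rate": 44100},
}
print(sorted(example.keys()))
```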
### Data Splits
This dataset has 2 balanced splits: _train_ (90%) and _validation_ (10%).
### Licensing Information
This dataset version 1.0.0 is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```
@conference {bogdanov2019mtg,
author = "Bogdanov, Dmitry and Won, Minz and Tovstogan, Philip and Porter, Alastair and Serra, Xavier",
title = "The MTG-Jamendo Dataset for Automatic Music Tagging",
booktitle = "Machine Learning for Music Discovery Workshop, International Conference on Machine Learning (ICML 2019)",
year = "2019",
address = "Long Beach, CA, United States",
url = "http://hdl.handle.net/10230/42015"
}
``` |
open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties | ---
pretty_name: Evaluation run of tuantran1632001/Psyfighter2-Orca2-13B-ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tuantran1632001/Psyfighter2-Orca2-13B-ties](https://huggingface.co/tuantran1632001/Psyfighter2-Orca2-13B-ties)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T15:05:15.491956](https://huggingface.co/datasets/open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties/blob/main/results_2024-01-13T15-05-15.491956.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.603383654987928,\n\
\ \"acc_stderr\": 0.03303267584618269,\n \"acc_norm\": 0.607142680047232,\n\
\ \"acc_norm_stderr\": 0.033700954867739115,\n \"mc1\": 0.39167686658506734,\n\
\ \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.5540489547722205,\n\
\ \"mc2_stderr\": 0.01582448369078134\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225403,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.01415063143511173\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6296554471220872,\n\
\ \"acc_stderr\": 0.00481910045686781,\n \"acc_norm\": 0.8173670583549094,\n\
\ \"acc_norm_stderr\": 0.0038557568514415437\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278008,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278008\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462826,\n \"\
acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462826\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878948,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878948\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.016970289090458026,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.016970289090458026\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290923,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290923\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n\
\ \"acc_stderr\": 0.014583812465862557,\n \"acc_norm\": 0.789272030651341,\n\
\ \"acc_norm_stderr\": 0.014583812465862557\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n\
\ \"acc_stderr\": 0.015901432608930354,\n \"acc_norm\": 0.3452513966480447,\n\
\ \"acc_norm_stderr\": 0.015901432608930354\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684965,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684965\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4322033898305085,\n\
\ \"acc_stderr\": 0.012652297777114968,\n \"acc_norm\": 0.4322033898305085,\n\
\ \"acc_norm_stderr\": 0.012652297777114968\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866356,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866356\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106567,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106567\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39167686658506734,\n\
\ \"mc1_stderr\": 0.01708779588176963,\n \"mc2\": 0.5540489547722205,\n\
\ \"mc2_stderr\": 0.01582448369078134\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091088\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43669446550416985,\n \
\ \"acc_stderr\": 0.013661649780905488\n }\n}\n```"
repo_url: https://huggingface.co/tuantran1632001/Psyfighter2-Orca2-13B-ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-15.491956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-05-15.491956.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- '**/details_harness|winogrande|5_2024-01-13T15-05-15.491956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T15-05-15.491956.parquet'
- config_name: results
data_files:
- split: 2024_01_13T15_05_15.491956
path:
- results_2024-01-13T15-05-15.491956.parquet
- split: latest
path:
- results_2024-01-13T15-05-15.491956.parquet
---
# Dataset Card for Evaluation run of tuantran1632001/Psyfighter2-Orca2-13B-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tuantran1632001/Psyfighter2-Orca2-13B-ties](https://huggingface.co/tuantran1632001/Psyfighter2-Orca2-13B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-13T15:05:15.491956](https://huggingface.co/datasets/open-llm-leaderboard/details_tuantran1632001__Psyfighter2-Orca2-13B-ties/blob/main/results_2024-01-13T15-05-15.491956.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.603383654987928,
"acc_stderr": 0.03303267584618269,
"acc_norm": 0.607142680047232,
"acc_norm_stderr": 0.033700954867739115,
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.5540489547722205,
"mc2_stderr": 0.01582448369078134
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225403,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.01415063143511173
},
"harness|hellaswag|10": {
"acc": 0.6296554471220872,
"acc_stderr": 0.00481910045686781,
"acc_norm": 0.8173670583549094,
"acc_norm_stderr": 0.0038557568514415437
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278008,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278008
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562417,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462826,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723875,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878948,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878948
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.016970289090458026,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.016970289090458026
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290923,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290923
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128136,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.014583812465862557,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.014583812465862557
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3452513966480447,
"acc_stderr": 0.015901432608930354,
"acc_norm": 0.3452513966480447,
"acc_norm_stderr": 0.015901432608930354
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.027363593284684965,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.027363593284684965
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.02584224870090217,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.02584224870090217
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4322033898305085,
"acc_stderr": 0.012652297777114968,
"acc_norm": 0.4322033898305085,
"acc_norm_stderr": 0.012652297777114968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.019524316744866356,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.019524316744866356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106567,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106567
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39167686658506734,
"mc1_stderr": 0.01708779588176963,
"mc2": 0.5540489547722205,
"mc2_stderr": 0.01582448369078134
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091088
},
"harness|gsm8k|5": {
"acc": 0.43669446550416985,
"acc_stderr": 0.013661649780905488
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DBQ/Mr.Porter.Product.prices.United.States | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: United States - Mr Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Mr Porter
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 11724890
num_examples: 35741
download_size: 2718002
dataset_size: 11724890
---
# Mr Porter web scraped data
## About the website
Mr Porter operates within the **e-commerce industry**, specifically within the **men's luxury fashion** segment, in the United States. This industry has consistently demonstrated strong growth throughout the Americas, particularly in the United States, where online retail is booming. The ability to purchase high-end fashion items online has revolutionized how American consumers shop, making upscale fashion more accessible. This specific dataset provides **e-commerce product-list page (PLP) data** on Mr Porter, which offers key insight into customer preferences, popular items, and overall market trends in the United States. Digital marketing techniques have strengthened this shopping medium, making data collection easier and more effective, thereby boosting this sector's potential for profitability.
## Link to **dataset**
[United States - Mr Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Mr%20Porter%20Product-prices%20United%20States/r/recPOcpE0GqT6uLVc)
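As a rough illustration of the schema in the feature list above, the discount flag can be derived from the two price columns. This is a hedged sketch on toy values: the rule `flg_discount = 1` when `price < full_price` is an assumption inferred from the field names, not documented behavior of this dataset.

```python
# Toy sketch of the product-price schema described in the card above.
# Field names follow the dataset_info feature list; the discount rule
# (flg_discount = 1 when price < full_price) is an assumption.

def derive_discount_flag(full_price: float, price: float) -> int:
    """Return 1 if the listed price is below the full price, else 0."""
    return 1 if price < full_price else 0

rows = [
    {"brand": "ExampleBrand", "full_price": 250.0, "price": 175.0},
    {"brand": "OtherBrand", "full_price": 90.0, "price": 90.0},
]

for row in rows:
    row["flg_discount"] = derive_discount_flag(row["full_price"], row["price"])

# Count discounted items in this toy sample.
discounted = sum(r["flg_discount"] for r in rows)
print(discounted)  # → 1
```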
|
alvarobartt/zephyr-7b-beta-judgelm-new-test | ---
dataset_info:
features:
- name: source
dtype: string
- name: input
dtype: string
- name: models
sequence: string
- name: completions
list:
- name: annotations
struct:
- name: instruction_following
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: honesty
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: truthfulness
struct:
- name: Type
sequence: string
- name: Rationale
dtype: string
- name: Rating
dtype: string
- name: Rationale For Rating
dtype: string
- name: helpfulness
struct:
- name: Type
sequence: string
- name: Rationale
dtype: string
- name: Rating
dtype: string
- name: Rationale For Rating
dtype: string
- name: custom_system_prompt
dtype: string
- name: model
dtype: string
- name: principle
dtype: string
- name: response
dtype: string
- name: critique
dtype: string
- name: overall_score
dtype: float64
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: generation_model
dtype: string
- name: generation_prompt
dtype: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: ratings
sequence: int64
- name: rationale
dtype: string
splits:
- name: train
num_bytes: 61495
num_examples: 2
download_size: 107464
dataset_size: 61495
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "zephyr-7b-beta-judgelm-new-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
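The nested `completions` schema listed in the YAML above can be navigated as plain Python dicts once loaded. A toy sketch follows; the field names come from the feature list, while all values are invented for illustration:

```python
# Toy record mirroring the nested `completions` schema from the card above.
# Field names follow the dataset_info feature list; values are invented.
record = {
    "source": "example",
    "completions": [
        {
            "model": "model-a",
            "response": "Answer A",
            "annotations": {
                "helpfulness": {"Rating": "4", "Rationale": "Mostly correct."},
            },
            "overall_score": 8.5,
        },
        {
            "model": "model-b",
            "response": "Answer B",
            "annotations": {
                "helpfulness": {"Rating": "2", "Rationale": "Off-topic."},
            },
            "overall_score": 4.0,
        },
    ],
}

# Pick the completion with the highest overall_score.
best = max(record["completions"], key=lambda c: c["overall_score"])
print(best["model"])  # → model-a
```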
leeseeun/tokenized_news_2gb_4096 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2228931880
num_examples: 136010
download_size: 0
dataset_size: 2228931880
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tokenized_news_2gb_4096"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anshulsc/Aurelius_Magic_75k | ---
dataset_info:
features:
- name: code
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 173676201
num_examples: 75197
download_size: 69971364
dataset_size: 173676201
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bablu75/test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5525
num_examples: 9
download_size: 5860
dataset_size: 5525
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
boapps/alpaca_hu_mt | ---
license: apache-2.0
task_categories:
- text-generation
language:
- hu
---
A Hungarian-language multi-turn dataset for fine-tuning GPT models.
It was generated with a Gemini model from the Gemini alpaca responses. |
autoevaluate/autoeval-eval-futin__feed-sen_en-2f01d7-2175769985 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-66b
metrics: []
dataset_name: futin/feed
dataset_config: sen_en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-66b
* Dataset: futin/feed
* Config: sen_en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
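Zero-shot text classification of the kind evaluated here scores each candidate class against the input text and returns the best-scoring class. The following is a deliberately simplified keyword-overlap sketch, not the actual OPT-66B log-likelihood procedure used by the evaluator:

```python
# Toy zero-shot classifier: rank candidate classes by token overlap with
# the input text. Real zero-shot evaluation instead ranks classes by the
# model's log-likelihood of each (text, class) pairing.

def zero_shot_classify(text: str, classes: list[str]) -> str:
    """Return the candidate class whose words overlap most with the text."""
    text_tokens = set(text.lower().split())

    def score(cls: str) -> int:
        return len(text_tokens & set(cls.lower().split()))

    return max(classes, key=score)

print(zero_shot_classify(
    "the new phone camera is great",
    ["phone review", "sports news"],
))  # → phone review
```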
roa7n/patched_test_p_20_m1_predictions_v3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
- name: m1_preds
dtype: float32
splits:
- name: train
num_bytes: 1534784568
num_examples: 2775054
download_size: 135840078
dataset_size: 1534784568
---
# Dataset Card for "patched_test_p_20_m1_predictions_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-356m | ---
pretty_name: Evaluation run of AI-Sweden-Models/gpt-sw3-356m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AI-Sweden-Models/gpt-sw3-356m](https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-356m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-06T17:48:56.014901](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-356m/blob/main/results_2023-12-06T17-48-56.014901.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25979396469353283,\n\
\ \"acc_stderr\": 0.030934747009693227,\n \"acc_norm\": 0.26083135270282387,\n\
\ \"acc_norm_stderr\": 0.03173400312275678,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752325,\n \"mc2\": 0.4254820499111399,\n\
\ \"mc2_stderr\": 0.014746028552389436\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21245733788395904,\n \"acc_stderr\": 0.011953482906582947,\n\
\ \"acc_norm\": 0.2363481228668942,\n \"acc_norm_stderr\": 0.012414960524301836\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3201553475403306,\n\
\ \"acc_stderr\": 0.0046558259808920175,\n \"acc_norm\": 0.37054371639115713,\n\
\ \"acc_norm_stderr\": 0.004819633668832553\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.03820169914517905,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.03820169914517905\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123387,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123387\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095455,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095455\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727772,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727772\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.02185150982203172,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.02185150982203172\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2032258064516129,\n\
\ \"acc_stderr\": 0.022891687984554952,\n \"acc_norm\": 0.2032258064516129,\n\
\ \"acc_norm_stderr\": 0.022891687984554952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.03031509928561773,\n\
\ \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.03031509928561773\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1919191919191919,\n \"acc_stderr\": 0.02805779167298901,\n \"\
acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.02805779167298901\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.03308818594415751,\n\
\ \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.03308818594415751\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28717948717948716,\n \"acc_stderr\": 0.022939925418530616,\n\
\ \"acc_norm\": 0.28717948717948716,\n \"acc_norm_stderr\": 0.022939925418530616\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780305,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780305\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923393,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923393\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842565,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842565\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.3542600896860987,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.19008264462809918,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.19008264462809918,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.19631901840490798,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.19631901840490798,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531773,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531773\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23371647509578544,\n\
\ \"acc_stderr\": 0.015133383278988841,\n \"acc_norm\": 0.23371647509578544,\n\
\ \"acc_norm_stderr\": 0.015133383278988841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.1907514450867052,\n \"acc_stderr\": 0.021152676966575298,\n\
\ \"acc_norm\": 0.1907514450867052,\n \"acc_norm_stderr\": 0.021152676966575298\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824768,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824768\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451156,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503793,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503793\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.010926496102034952,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.010926496102034952\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752325,\n \"mc2\": 0.4254820499111399,\n\
\ \"mc2_stderr\": 0.014746028552389436\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5303867403314917,\n \"acc_stderr\": 0.014026510839428743\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674313\n }\n}\n```"
repo_url: https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|arc:challenge|25_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|gsm8k|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hellaswag|10_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-48-56.014901.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-06T17-48-56.014901.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- '**/details_harness|winogrande|5_2023-12-06T17-48-56.014901.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-06T17-48-56.014901.parquet'
- config_name: results
data_files:
- split: 2023_12_06T17_48_56.014901
path:
- results_2023-12-06T17-48-56.014901.parquet
- split: latest
path:
- results_2023-12-06T17-48-56.014901.parquet
---
# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-356m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AI-Sweden-Models/gpt-sw3-356m](https://huggingface.co/AI-Sweden-Models/gpt-sw3-356m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-356m",
"harness_winogrande_5",
	split="latest")
```
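The aggregated "results" configuration can be loaded the same way. For orientation, the top-level "all" entry in the results JSON below appears to be an unweighted mean of the per-task metrics; the following is a minimal offline sketch of that aggregation, using hypothetical per-task accuracies that mirror the JSON's key structure (the exact leaderboard aggregation may differ):

```python
# Hypothetical per-task accuracies, keyed like the entries in the results JSON.
per_task = {
    "harness|arc:challenge|25": 0.21,
    "harness|hellaswag|10": 0.32,
    "harness|hendrycksTest-abstract_algebra|5": 0.22,
}

# Assumption: the "all" accuracy is the unweighted mean over every evaluated task.
overall_acc = sum(per_task.values()) / len(per_task)
print(round(overall_acc, 4))  # → 0.25
```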
## Latest results
These are the [latest results from run 2023-12-06T17:48:56.014901](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-356m/blob/main/results_2023-12-06T17-48-56.014901.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.25979396469353283,
"acc_stderr": 0.030934747009693227,
"acc_norm": 0.26083135270282387,
"acc_norm_stderr": 0.03173400312275678,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752325,
"mc2": 0.4254820499111399,
"mc2_stderr": 0.014746028552389436
},
"harness|arc:challenge|25": {
"acc": 0.21245733788395904,
"acc_stderr": 0.011953482906582947,
"acc_norm": 0.2363481228668942,
"acc_norm_stderr": 0.012414960524301836
},
"harness|hellaswag|10": {
"acc": 0.3201553475403306,
"acc_stderr": 0.0046558259808920175,
"acc_norm": 0.37054371639115713,
"acc_norm_stderr": 0.004819633668832553
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517905,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517905
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123387,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727772,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727772
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.02185150982203172,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.02185150982203172
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302052,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302052
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2032258064516129,
"acc_stderr": 0.022891687984554952,
"acc_norm": 0.2032258064516129,
"acc_norm_stderr": 0.022891687984554952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.03031509928561773,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.03031509928561773
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.02805779167298901,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.02805779167298901
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.03308818594415751,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.03308818594415751
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28717948717948716,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.28717948717948716,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780305,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780305
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842565,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.19008264462809918,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.19008264462809918,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.19631901840490798,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.19631901840490798,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531773,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531773
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23371647509578544,
"acc_stderr": 0.015133383278988841,
"acc_norm": 0.23371647509578544,
"acc_norm_stderr": 0.015133383278988841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.021152676966575298,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.021152676966575298
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824768,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824768
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451156,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503793,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503793
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.010926496102034952,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.010926496102034952
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752325,
"mc2": 0.4254820499111399,
"mc2_stderr": 0.014746028552389436
},
"harness|winogrande|5": {
"acc": 0.5303867403314917,
"acc_stderr": 0.014026510839428743
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674313
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
phyloforfun/HLT_MICH_Angiospermae_SLTPvA_v1-0_medium__OCR-C25-L25-E50-R05 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 16788522
num_examples: 10001
download_size: 3119218
dataset_size: 16788522
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dim/chip2_instruct_alpha_prompt_en | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 85102023
num_examples: 210289
download_size: 50192027
dataset_size: 85102023
---
# Dataset Card for "chip2_instruct_alpha_prompt_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pedromigurasdev/autotrain-data-autotrain-jose-antorcha-22 | ---
dataset_info:
features:
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 555000
num_examples: 840
- name: validation
num_bytes: 555000
num_examples: 840
download_size: 84992
dataset_size: 1110000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-autotrain-jose-antorcha-22"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AiresPucrs/sentiment-analysis | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 71842095
num_examples: 85089
download_size: 44486982
dataset_size: 71842095
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
---
# Sentiment Analysis dataset
This dataset is a concatenation of the [`IMDB 50K`](https://www.kaggle.com/datasets/lakshmi25npathi/imdb-dataset-of-50k-movie-reviews?select=IMDB+Dataset.csv), the [`Twitter US Airline Sentiment`](https://www.kaggle.com/datasets/crowdflower/twitter-airline-sentiment), App Reviews scraped from [Google Play](https://github.com/Nkluge-correa/teeny-tiny_castle/blob/master/ML%20Explainability/NLP%20Interpreter/text_scraping.ipynb), and the [`EcoPreprocessed`](https://www.kaggle.com/datasets/pradeeshprabhakar/preprocessed-dataset-sentiment-analysis).
## Overview
## Dataset Details
**Citation:**
```latex
```
## Contents
The dataset consists of a data frame with the following columns:
- **text:** the text of the sample.
- **label:** the sentiment label (integer).
## How to use
```python
from datasets import load_dataset
dataset = load_dataset("AiresPucrs/sentiment-analysis", split='train')
```
## License
This dataset is licensed under the Apache License 2.0. |
NeelNanda/c4-code-20k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 101351288
num_examples: 20000
download_size: 42778874
dataset_size: 101351288
---
# Dataset Card for "c4-code-20k"
10K elements of C4 and 10K elements of code parrot clean (Python code).
Note that these are the datasets used to train my interpretability-friendly models, but the mixture here is *not* the one used in training. Those models were trained on roughly 83% C4 and 17% Python code (ish) by tokens. This dataset has 10K strings of each, which by tokens works out to about 22M of code and 5M of C4 (code is longer and harder to compress!)
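The by-token imbalance can be checked with quick arithmetic, using the approximate token counts stated above:

```python
# Approximate token counts from this card (rough figures, not exact).
code_tokens = 22_000_000  # ~22M tokens of Python code
c4_tokens = 5_000_000     # ~5M tokens of C4 web text

code_fraction = code_tokens / (code_tokens + c4_tokens)
print(f"code fraction by tokens: {code_fraction:.1%}")  # → code fraction by tokens: 81.5%
```

That is far from the ~17% code mixture the models were trained on, which is the point of the caveat above.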
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ngxingyu/iwslt17_google_trans_scores_sentiments | ---
dataset_info:
features:
- name: bleurt_score
dtype: float64
- name: comet_score
dtype: float64
- name: en
dtype: string
- name: google_zh
dtype: string
- name: zh
dtype: string
- name: en_sentiment
sequence: float32
- name: zh_sentiment
sequence: float32
splits:
- name: train
num_bytes: 76348903
num_examples: 229736
- name: validation
num_bytes: 330850
num_examples: 875
- name: test
num_bytes: 2770955
num_examples: 8549
download_size: 56078469
dataset_size: 79450708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
llmware/rag_instruct_test_dataset_0.1 | ---
license: apache-2.0
tags:
- finance
- legal
pretty_name: RAG Instruct Test Dataset - Basic - v0.1
---
# Dataset Card for RAG-Instruct-Test-Dataset
### Dataset Summary
This is a test dataset for basic "retrieval augmented generation" (RAG) use cases in the enterprise, especially finance and legal. It includes 100 samples with context passages pulled from common retrieval scenarios, e.g., financial news, earnings releases, contracts, invoices, technical articles, general news, and short texts. The primary use case is to evaluate the effectiveness of an instruct-fine-tuned LLM used in conjunction with closed-context, fact-based question-answering, key-value extraction, and summarization with bullet points. The context passages are relatively short, ranging from ~100 to ~500 tokens; the set was designed for use with the BLING series of models but is suitable for comparative evaluation of any LLM in basic RAG scenarios.
### **PERFORMANCE on BASIC RAG TEST DATASET**
| Model | Params (B) | Sourcing | GPU/CPU | Output Tokens | Out as % of Input | Process Time (secs) | Score (0-100) |
| :---------- | :--------: | :----: | :-----: | :---------: | :-------: | :--------: | :-------: |
| gpt-4 | <=1000 | Closed | Multi-GPU | 2665 | 10.53% | 183.8 | 100 |
| gpt-3.5-turbo-instruct| <=175 | Closed | Multi-GPU | 2621 | 11.49% | 62.7 | 100 |
| claude-instant-v1 | <=50 | Closed | Multi-GPU | 6337 | 26.50% | 154 | 100 |
| aib-read-gpt | 7 | Closed | GPU | 1964 | 9.30% | 114 | 96 |
| bling_falcon-1b-0.1 | 1.3 | Open | CPU | 3204 | 14.55% | 696 | 77 |
| bling_pythia-1.4b-0.1 | 1.4 | Open | CPU | 2589 | 11.75% | 593.5 | 65 |
| bling_pythia-1b-0.1 | 1.0 | Open | CPU | 2753 | 12.49% | 428 | 59 |
| bling_cerebras-1.3b | 1.3 | Open | CPU | 3202 | 20.01% | 690.1 | 52 |
| bling_pythia_410m | 0.41 | NA | CPU | 2349 | 10.66% | 189 | 36 |
| bling_cerebras_590m | 0.59 | NA | CPU | 4407 | 20.01% | 400.8 | 30 |
Please check out our [BLOG](https://medium.com/@darrenoberst/evaluating-llm-performance-in-rag-instruct-use-cases-083dc272a31d) with more details, commentary and comparative results testing with this dataset.
We will be enhancing the test dataset as well as creating more advanced test datasets in the future.
### Languages
English
## Dataset Structure
100 JSONL samples with 4 keys - "query" | "context" | "answer" | "sample_number"
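As a sketch of the record format, each line of the JSONL file can be parsed with the standard library. The field values below are invented for illustration; only the four key names come from this card:

```python
import json

# A made-up line in the dataset's JSONL schema; only the key names
# ("query", "context", "answer", "sample_number") come from this card.
line = json.dumps({
    "query": "What was the revenue in the quarter?",
    "context": "The company reported quarterly revenue of $4.2 million.",
    "answer": "$4.2 million",
    "sample_number": 1,
})

record = json.loads(line)
assert set(record.keys()) == {"query", "context", "answer", "sample_number"}
```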
### Personal and Sensitive Information
The dataset samples were written bespoke for this objective, but do rely upon some public information, including major public figures and widely reported events.
Any other names were created/masked and any overlap with real companies or people is coincidental.
## Dataset Card Contact
Darren Oberst & llmware team
Please reach out anytime if you are interested in this project and would like to participate and work with us!
|
jigarsiddhpura/IPD | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
---
<div align="center">
<img width="640" alt="jigarsiddhpura/IPD" src="https://huggingface.co/datasets/jigarsiddhpura/IPD/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['dry-person', 'object', 'wet-swimmer']
```
### Number of Images
```json
{'test': 77, 'valid': 153, 'train': 1608}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("jigarsiddhpura/IPD", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/resq/tiny-people-detection-rpi/dataset/1](https://universe.roboflow.com/resq/tiny-people-detection-rpi/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ tiny-people-detection-rpi_dataset,
title = { Tiny people detection RPI Dataset },
type = { Open Source Dataset },
author = { ResQ },
howpublished = { \\url{ https://universe.roboflow.com/resq/tiny-people-detection-rpi } },
url = { https://universe.roboflow.com/resq/tiny-people-detection-rpi },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2023 },
month = { sep },
note = { visited on 2024-02-11 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on February 10, 2024 at 7:28 AM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand and search unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
For state of the art Computer Vision training notebooks you can use with this dataset,
visit https://github.com/roboflow/notebooks
To find over 100k other datasets and pre-trained models, visit https://universe.roboflow.com
The dataset includes 1838 images.
People are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 640x640 (Stretch)
The following augmentation was applied to create 3 versions of each source image:
* Randomly crop between 0 and 67 percent of the image
* Salt and pepper noise was applied to 4 percent of pixels
The following transformations were applied to the bounding boxes of each image:
* Random shear of between -5° to +5° horizontally and -5° to +5° vertically
|
GGINCoder/llfinetunung | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 764677
num_examples: 500
download_size: 451664
dataset_size: 764677
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ZhenbinWang/pelvis | ---
license: unlicense
---
|
vrish/ads-dpo-top1000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: id
dtype: int64
- name: title
dtype: string
- name: year
dtype: int64
- name: abstract
dtype: string
- name: read_count
dtype: int64
- name: cite_read_boost
dtype: float64
- name: citation_count
dtype: int64
- name: clickbait
dtype: int64
- name: __index_level_0__
dtype: int64
- name: selected_title
dtype: string
- name: rejected_title
dtype: string
splits:
- name: train
num_bytes: 1481849
num_examples: 1000
download_size: 842517
dataset_size: 1481849
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dewithsan/secop_corpus_clean | ---
dataset_info:
features:
- name: id_doc
dtype: string
- name: doc_text
dtype: string
splits:
- name: train
num_bytes: 136440633
num_examples: 10934
download_size: 67753697
dataset_size: 136440633
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo16_2_mix_50_kl_0.1_prm_160m_thr_1.0_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43779189
num_examples: 18928
- name: epoch_1
num_bytes: 44389394
num_examples: 18928
- name: epoch_2
num_bytes: 44452838
num_examples: 18928
- name: epoch_3
num_bytes: 44504767
num_examples: 18928
- name: epoch_4
num_bytes: 44525045
num_examples: 18928
- name: epoch_5
num_bytes: 44512817
num_examples: 18928
- name: epoch_6
num_bytes: 44504063
num_examples: 18928
- name: epoch_7
num_bytes: 44468006
num_examples: 18928
- name: epoch_8
num_bytes: 44457766
num_examples: 18928
- name: epoch_9
num_bytes: 44452281
num_examples: 18928
- name: epoch_10
num_bytes: 44441633
num_examples: 18928
- name: epoch_11
num_bytes: 44440113
num_examples: 18928
- name: epoch_12
num_bytes: 44438738
num_examples: 18928
- name: epoch_13
num_bytes: 44437694
num_examples: 18928
- name: epoch_14
num_bytes: 44438979
num_examples: 18928
- name: epoch_15
num_bytes: 44434772
num_examples: 18928
- name: epoch_16
num_bytes: 44431885
num_examples: 18928
- name: epoch_17
num_bytes: 44430771
num_examples: 18928
- name: epoch_18
num_bytes: 44430902
num_examples: 18928
- name: epoch_19
num_bytes: 44429917
num_examples: 18928
- name: epoch_20
num_bytes: 44430629
num_examples: 18928
- name: epoch_21
num_bytes: 44429778
num_examples: 18928
- name: epoch_22
num_bytes: 44429225
num_examples: 18928
- name: epoch_23
num_bytes: 44432672
num_examples: 18928
- name: epoch_24
num_bytes: 44429439
num_examples: 18928
- name: epoch_25
num_bytes: 44429477
num_examples: 18928
- name: epoch_26
num_bytes: 44429378
num_examples: 18928
- name: epoch_27
num_bytes: 44429813
num_examples: 18928
- name: epoch_28
num_bytes: 44426534
num_examples: 18928
- name: epoch_29
num_bytes: 44428539
num_examples: 18928
download_size: 700085000
dataset_size: 1332697054
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
open-llm-leaderboard/details_ShadowFall09__tyc_test1 | ---
pretty_name: Evaluation run of ShadowFall09/FANNO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ShadowFall09/FANNO](https://huggingface.co/ShadowFall09/FANNO) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ShadowFall09__FANNO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T15:11:57.870281](https://huggingface.co/datasets/open-llm-leaderboard/details_ShadowFall09__FANNO/blob/main/results_2024-03-24T15-11-57.870281.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46829493868997685,\n\
\ \"acc_stderr\": 0.03446403369721223,\n \"acc_norm\": 0.4728706098467404,\n\
\ \"acc_norm_stderr\": 0.03524398659366996,\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5204596837543014,\n\
\ \"mc2_stderr\": 0.015375758554330876\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127106,\n\
\ \"acc_norm\": 0.5546075085324232,\n \"acc_norm_stderr\": 0.014523987638344078\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6021708822943637,\n\
\ \"acc_stderr\": 0.004884495069459695,\n \"acc_norm\": 0.7928699462258514,\n\
\ \"acc_norm_stderr\": 0.004044213304049373\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500482,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500482\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577657,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.47096774193548385,\n\
\ \"acc_stderr\": 0.028396016402761005,\n \"acc_norm\": 0.47096774193548385,\n\
\ \"acc_norm_stderr\": 0.028396016402761005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970187,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970187\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.037818873532059816,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.037818873532059816\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.034474782864143565,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.034474782864143565\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4512820512820513,\n \"acc_stderr\": 0.025230381238934833,\n\
\ \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.025230381238934833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6238532110091743,\n\
\ \"acc_stderr\": 0.02076923196820508,\n \"acc_norm\": 0.6238532110091743,\n\
\ \"acc_norm_stderr\": 0.02076923196820508\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n\
\ \"acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5343137254901961,\n \"acc_stderr\": 0.03501038327635897,\n \"\
acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.03501038327635897\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.048026946982589726,\n\
\ \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.048026946982589726\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.02920254015343119,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.02920254015343119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6462324393358876,\n\
\ \"acc_stderr\": 0.017098184708161903,\n \"acc_norm\": 0.6462324393358876,\n\
\ \"acc_norm_stderr\": 0.017098184708161903\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808838,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808838\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325956,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325956\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5154320987654321,\n \"acc_stderr\": 0.027807490044276198,\n\
\ \"acc_norm\": 0.5154320987654321,\n \"acc_norm_stderr\": 0.027807490044276198\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759422,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759422\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3559322033898305,\n\
\ \"acc_stderr\": 0.012228645537277568,\n \"acc_norm\": 0.3559322033898305,\n\
\ \"acc_norm_stderr\": 0.012228645537277568\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275668,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275668\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4542483660130719,\n \"acc_stderr\": 0.020142974553795198,\n \
\ \"acc_norm\": 0.4542483660130719,\n \"acc_norm_stderr\": 0.020142974553795198\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49795918367346936,\n \"acc_stderr\": 0.0320089533497105,\n\
\ \"acc_norm\": 0.49795918367346936,\n \"acc_norm_stderr\": 0.0320089533497105\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\
\ \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.6368159203980099,\n\
\ \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6549707602339181,\n \"acc_stderr\": 0.036459813773888065,\n\
\ \"acc_norm\": 0.6549707602339181,\n \"acc_norm_stderr\": 0.036459813773888065\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5204596837543014,\n\
\ \"mc2_stderr\": 0.015375758554330876\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440474\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1463229719484458,\n \
\ \"acc_stderr\": 0.00973521055778525\n }\n}\n```"
repo_url: https://huggingface.co/ShadowFall09/FANNO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-11-57.870281.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-11-57.870281.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- '**/details_harness|winogrande|5_2024-03-24T15-11-57.870281.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T15-11-57.870281.parquet'
- config_name: results
data_files:
- split: 2024_03_24T15_11_57.870281
path:
- results_2024-03-24T15-11-57.870281.parquet
- split: latest
path:
- results_2024-03-24T15-11-57.870281.parquet
---
# Dataset Card for Evaluation run of ShadowFall09/FANNO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ShadowFall09/FANNO](https://huggingface.co/ShadowFall09/FANNO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
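The per-run split names seen in the configs above follow directly from the run timestamp. A minimal sketch of that convention (the helper below is illustrative, inferred from the config names in this card, not part of any library):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert an ISO-like run timestamp into the split name used in
    this card's configs: dashes and colons both become underscores,
    the fractional seconds are kept as-is."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-24T15:11:57.870281"))
# prints 2024_03_24T15_11_57.870281
```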
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ShadowFall09__FANNO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-24T15:11:57.870281](https://huggingface.co/datasets/open-llm-leaderboard/details_ShadowFall09__FANNO/blob/main/results_2024-03-24T15-11-57.870281.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46829493868997685,
"acc_stderr": 0.03446403369721223,
"acc_norm": 0.4728706098467404,
"acc_norm_stderr": 0.03524398659366996,
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5204596837543014,
"mc2_stderr": 0.015375758554330876
},
"harness|arc:challenge|25": {
"acc": 0.5213310580204779,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5546075085324232,
"acc_norm_stderr": 0.014523987638344078
},
"harness|hellaswag|10": {
"acc": 0.6021708822943637,
"acc_stderr": 0.004884495069459695,
"acc_norm": 0.7928699462258514,
"acc_norm_stderr": 0.004044213304049373
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577657,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47096774193548385,
"acc_stderr": 0.028396016402761005,
"acc_norm": 0.47096774193548385,
"acc_norm_stderr": 0.028396016402761005
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970187,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970187
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.037818873532059816,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.037818873532059816
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.034474782864143565,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.034474782864143565
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6238532110091743,
"acc_stderr": 0.02076923196820508,
"acc_norm": 0.6238532110091743,
"acc_norm_stderr": 0.02076923196820508
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.03501038327635897,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.03501038327635897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.048026946982589726,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.048026946982589726
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.02920254015343119,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.02920254015343119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6462324393358876,
"acc_stderr": 0.017098184708161903,
"acc_norm": 0.6462324393358876,
"acc_norm_stderr": 0.017098184708161903
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808838,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808838
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325956,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5154320987654321,
"acc_stderr": 0.027807490044276198,
"acc_norm": 0.5154320987654321,
"acc_norm_stderr": 0.027807490044276198
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3559322033898305,
"acc_stderr": 0.012228645537277568,
"acc_norm": 0.3559322033898305,
"acc_norm_stderr": 0.012228645537277568
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275668,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275668
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4542483660130719,
"acc_stderr": 0.020142974553795198,
"acc_norm": 0.4542483660130719,
"acc_norm_stderr": 0.020142974553795198
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49795918367346936,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.49795918367346936,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6549707602339181,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.6549707602339181,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5204596837543014,
"mc2_stderr": 0.015375758554330876
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440474
},
"harness|gsm8k|5": {
"acc": 0.1463229719484458,
"acc_stderr": 0.00973521055778525
}
}
```
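As an illustration of how per-task scores like those above roll up into the "all" aggregate, here is a minimal sketch using a small, hand-copied subset of the results (the `mean_acc` helper and the unweighted-mean assumption are illustrative, not the leaderboard's actual aggregation code):

```python
# Illustrative subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4444444444444444},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.39473684210526316},
}

def mean_acc(per_task: dict) -> float:
    """Unweighted mean of per-task accuracy scores (an assumption;
    the leaderboard's own aggregation may differ)."""
    accs = [scores["acc"] for scores in per_task.values()]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))  # prints 0.3797
```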
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4 | ---
pretty_name: Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4](https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T06:55:06.084057](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4/blob/main/results_2023-10-18T06-55-06.084057.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.0004191330178826871,\n \"f1\": 0.058036912751678014,\n\
\ \"f1_stderr\": 0.001339441597906354,\n \"acc\": 0.3295242323804896,\n\
\ \"acc_stderr\": 0.008622843965649133\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826871,\n\
\ \"f1\": 0.058036912751678014,\n \"f1_stderr\": 0.001339441597906354\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.018953752843062926,\n \
\ \"acc_stderr\": 0.0037560783410314704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6400947119179163,\n \"acc_stderr\": 0.013489609590266795\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T06_55_06.084057
path:
- '**/details_harness|drop|3_2023-10-18T06-55-06.084057.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T06-55-06.084057.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T06_55_06.084057
path:
- '**/details_harness|gsm8k|5_2023-10-18T06-55-06.084057.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T06-55-06.084057.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:54:40.304544.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:54:40.304544.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:54:40.304544.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T06_55_06.084057
path:
- '**/details_harness|winogrande|5_2023-10-18T06-55-06.084057.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T06-55-06.084057.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_54_40.304544
path:
- results_2023-07-19T15:54:40.304544.parquet
- split: 2023_10_18T06_55_06.084057
path:
- results_2023-10-18T06-55-06.084057.parquet
- split: latest
path:
- results_2023-10-18T06-55-06.084057.parquet
---
# Dataset Card for Evaluation run of TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4](https://huggingface.co/TehVenom/GPT-J-Pyg_PPO-6B-Dev-V8p4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
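Because the timestamped split names use a zero-padded, sortable format, the most recent run can be picked with a plain lexicographic comparison. A minimal sketch (assuming the split-name format shown in this card, and ignoring the `latest` alias):

```python
# Split names encode run timestamps in a zero-padded format,
# so lexicographic order matches chronological order.
splits = ["2023_07_19T15_54_40.304544", "2023_10_18T06_55_06.084057", "latest"]

# Drop the "latest" alias and pick the newest timestamped split.
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped)
print(newest)  # 2023_10_18T06_55_06.084057
```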
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T06:55:06.084057](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__GPT-J-Pyg_PPO-6B-Dev-V8p4/blob/main/results_2023-10-18T06-55-06.084057.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826871,
"f1": 0.058036912751678014,
"f1_stderr": 0.001339441597906354,
"acc": 0.3295242323804896,
"acc_stderr": 0.008622843965649133
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826871,
"f1": 0.058036912751678014,
"f1_stderr": 0.001339441597906354
},
"harness|gsm8k|5": {
"acc": 0.018953752843062926,
"acc_stderr": 0.0037560783410314704
},
"harness|winogrande|5": {
"acc": 0.6400947119179163,
"acc_stderr": 0.013489609590266795
}
}
```
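The `acc` value in the `"all"` block appears to be the plain mean of the per-task `acc` values (here `gsm8k` and `winogrande`); a quick sketch reproducing the reported aggregate from the numbers above:

```python
# Sketch: the aggregated "acc" looks like the unweighted mean of the
# per-task "acc" values reported in this run.
task_acc = {
    "harness|gsm8k|5": 0.018953752843062926,
    "harness|winogrande|5": 0.6400947119179163,
}
aggregated_acc = sum(task_acc.values()) / len(task_acc)
print(aggregated_acc)  # 0.3295242323804896, matching the "all" block
```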
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ds4sd/USPTO-30K | ---
dataset_info:
features:
- name: filename
dtype: string
- name: image
dtype: image
- name: mol
dtype: string
splits:
- name: clean
num_bytes: 88030343.0
num_examples: 10000
- name: abbreviated
num_bytes: 84064086.0
num_examples: 10000
- name: large
num_bytes: 238905697.0
num_examples: 10000
download_size: 291334748
dataset_size: 411000126.0
---
# USPTO-30K
USPTO-30K is the benchmark dataset introduced in [MolGrapher: Graph-based Visual Recognition of Chemical Structures](https://github.com/DS4SD/MolGrapher).
Existing benchmarks for Optical Chemical Structure Recognition have some limitations.
Being created from only a few documents, they contain batches of very similar molecules. For example, in a patent a molecule is typically displayed together with all the substituents of one particular substructure, resulting in large batches of almost identical molecules.
Additionally, the existing sets contain molecules of different kinds, including superatom groups and various Markush features, which should be evaluated independently.
In practice, it is important to delimit which types of molecules models can be applied to.
We introduce USPTO-30K, a large-scale benchmark dataset of annotated molecule images, which overcomes these limitations.
It is created using the pairs of images and MolFiles published by the United States Patent and Trademark Office.
Each molecule was independently selected from among all the available documents from 2001 to 2020.
The set consists of three subsets to decouple the study of clean molecules, molecules with abbreviations and large molecules.
- USPTO-10K contains 10,000 clean molecules, i.e. without any abbreviated groups.
- USPTO-10K-abb contains 10,000 molecules with superatom groups.
- USPTO-10K-L contains 10,000 clean molecules with more than 70 atoms.
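Since the `mol` column stores MolFile text, the >70-atom criterion used for USPTO-10K-L can be checked without a chemistry toolkit. A hedged sketch, assuming standard V2000 MolFiles (three header lines followed by a counts line whose first 3-character field is the atom count):

```python
def atom_count(molfile: str) -> int:
    """Read the atom count from the counts line of a V2000 MolFile.

    Assumes the standard layout: three header lines, then a counts
    line whose first 3-character field is the number of atoms.
    """
    counts_line = molfile.splitlines()[3]
    return int(counts_line[0:3])

# Tiny handcrafted single-atom MolFile for illustration.
example = """\

  sketch

  1  0  0  0  0  0  0  0  0  0999 V2000
    0.0000    0.0000    0.0000 C   0  0
M  END
"""
print(atom_count(example))  # 1

# USPTO-10K-L molecules would satisfy: atom_count(mol) > 70
```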
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Caosgalaxy/IA_VOICES | ---
license: unknown
---
|
IamV/fluent_slu_v1.0 | ---
dataset_info:
features:
- name: speakerId
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: action
dtype: string
- name: object
dtype: string
- name: location
dtype: string
splits:
- name: train
num_bytes: 1699942450.0
num_examples: 23132
- name: validation
num_bytes: 225199474.0
num_examples: 3118
- name: test
num_bytes: 297556687.0
num_examples: 3793
download_size: 1186447441
dataset_size: 2222698611.0
---
# Dataset Card for "fluent_slu_v1.0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadupstack-gis | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 61244
num_examples: 885
- name: corpus
num_bytes: 36704924
num_examples: 37637
download_size: 20083359
dataset_size: 36766168
---
# Dataset Card for "cqadupstack-gis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tals/vitaminc | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
pretty_name: VitaminC
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- text-classification
task_ids:
- fact-checking
- natural-language-inference
---
# Details
Fact Verification dataset created for [Get Your Vitamin C! Robust Fact Verification with Contrastive Evidence](https://aclanthology.org/2021.naacl-main.52/) (Schuster et al., NAACL '21) based on Wikipedia edits (revisions).
For more details see: https://github.com/TalSchuster/VitaminC
When using this dataset, please cite the paper:
# BibTeX entry and citation info
```bibtex
@inproceedings{schuster-etal-2021-get,
title = "Get Your Vitamin {C}! Robust Fact Verification with Contrastive Evidence",
author = "Schuster, Tal and
Fisch, Adam and
Barzilay, Regina",
booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.naacl-main.52",
doi = "10.18653/v1/2021.naacl-main.52",
pages = "624--643",
abstract = "Typical fact verification models use retrieved written evidence to verify claims. Evidence sources, however, often change over time as more information is gathered and revised. In order to adapt, models must be sensitive to subtle differences in supporting evidence. We present VitaminC, a benchmark infused with challenging cases that require fact verification models to discern and adjust to slight factual changes. We collect over 100,000 Wikipedia revisions that modify an underlying fact, and leverage these revisions, together with additional synthetically constructed ones, to create a total of over 400,000 claim-evidence pairs. Unlike previous resources, the examples in VitaminC are contrastive, i.e., they contain evidence pairs that are nearly identical in language and content, with the exception that one supports a given claim while the other does not. We show that training using this design increases robustness{---}improving accuracy by 10{\%} on adversarial fact verification and 6{\%} on adversarial natural language inference (NLI). Moreover, the structure of VitaminC leads us to define additional tasks for fact-checking resources: tagging relevant words in the evidence for verifying the claim, identifying factual revisions, and providing automatic edits via factually consistent text generation.",
}
``` |
Hikam22/ReviewDataset | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped | ---
pretty_name: Evaluation run of EleutherAI/pythia-2.8b-deduped
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EleutherAI/pythia-2.8b-deduped](https://huggingface.co/EleutherAI/pythia-2.8b-deduped)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T02:23:42.600907](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped/blob/main/results_2023-10-22T02-23-42.600907.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931190916,\n \"f1\": 0.0446549916107384,\n\
\ \"f1_stderr\": 0.0011620582208289672,\n \"acc\": 0.30527479800116447,\n\
\ \"acc_stderr\": 0.008130342870304771\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931190916,\n\
\ \"f1\": 0.0446549916107384,\n \"f1_stderr\": 0.0011620582208289672\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \
\ \"acc_stderr\": 0.002504942226860519\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6022099447513812,\n \"acc_stderr\": 0.013755743513749023\n\
\ }\n}\n```"
repo_url: https://huggingface.co/EleutherAI/pythia-2.8b-deduped
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T02_23_42.600907
path:
- '**/details_harness|drop|3_2023-10-22T02-23-42.600907.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T02-23-42.600907.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T02_23_42.600907
path:
- '**/details_harness|gsm8k|5_2023-10-22T02-23-42.600907.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T02-23-42.600907.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:26:01.712520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:26:01.712520.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:26:01.712520.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T02_23_42.600907
path:
- '**/details_harness|winogrande|5_2023-10-22T02-23-42.600907.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T02-23-42.600907.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_26_01.712520
path:
- results_2023-07-19T17:26:01.712520.parquet
- split: 2023_10_22T02_23_42.600907
path:
- results_2023-10-22T02-23-42.600907.parquet
- split: latest
path:
- results_2023-10-22T02-23-42.600907.parquet
---
# Dataset Card for Evaluation run of EleutherAI/pythia-2.8b-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-2.8b-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-2.8b-deduped](https://huggingface.co/EleutherAI/pythia-2.8b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped",
"harness_winogrande_5",
split="train")
```
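The split names used above follow a simple convention: a run's ISO timestamp has its `-` and `:` characters replaced with `_` (the `.` before the microseconds is kept). A minimal sketch of that mapping (the helper name is hypothetical, not part of the `datasets` API):

```python
def split_name_from_timestamp(ts: str) -> str:
    # Hypothetical helper: split names are the run timestamp with
    # '-' and ':' replaced by '_'; the '.' in the microseconds stays.
    return ts.replace("-", "_").replace(":", "_")

# The 2023-10-22 run maps to the split seen in the configs above.
print(split_name_from_timestamp("2023-10-22T02:23:42.600907"))
# → 2023_10_22T02_23_42.600907
```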
## Latest results
These are the [latest results from run 2023-10-22T02:23:42.600907](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped/blob/main/results_2023-10-22T02-23-42.600907.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190916,
"f1": 0.0446549916107384,
"f1_stderr": 0.0011620582208289672,
"acc": 0.30527479800116447,
"acc_stderr": 0.008130342870304771
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190916,
"f1": 0.0446549916107384,
"f1_stderr": 0.0011620582208289672
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.002504942226860519
},
"harness|winogrande|5": {
"acc": 0.6022099447513812,
"acc_stderr": 0.013755743513749023
}
}
```
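Each top-level key in this JSON is either the "all" aggregate or a `harness|<task>|<n_shot>` entry, so pulling out one task's metric is plain dict access. A small sketch (the `results` dict below is a subset copied from the output above; the helper is hypothetical):

```python
# A subset of the aggregated results shown above.
results = {
    "all": {"acc": 0.30527479800116447, "acc_stderr": 0.008130342870304771},
    "harness|gsm8k|5": {"acc": 0.008339651250947688},
    "harness|winogrande|5": {"acc": 0.6022099447513812},
}

def task_metric(results: dict, task: str, n_shot: int, metric: str = "acc") -> float:
    # Hypothetical helper: keys follow the "harness|<task>|<n_shot>" pattern.
    return results[f"harness|{task}|{n_shot}"][metric]

print(task_metric(results, "winogrande", 5))  # → 0.6022099447513812
```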
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
poorguys/chinese_fonts_single_128x128 | ---
dataset_info:
features:
- name: image
dtype: image
- name: char
dtype: string
- name: unicode
dtype: string
- name: font
dtype: string
- name: font_type
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 244349.0
num_examples: 65
download_size: 239164
dataset_size: 244349.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chinese_fonts_single_128x128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DrNicefellow__Mistral-2-from-Mixtral-8x7B-v0.1 | ---
pretty_name: Evaluation run of DrNicefellow/Mistral-2-from-Mixtral-8x7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DrNicefellow/Mistral-2-from-Mixtral-8x7B-v0.1](https://huggingface.co/DrNicefellow/Mistral-2-from-Mixtral-8x7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DrNicefellow__Mistral-2-from-Mixtral-8x7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T14:20:57.186857](https://huggingface.co/datasets/open-llm-leaderboard/details_DrNicefellow__Mistral-2-from-Mixtral-8x7B-v0.1/blob/main/results_2024-04-15T14-20-57.186857.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24196363480520452,\n\
\ \"acc_stderr\": 0.03023773260514397,\n \"acc_norm\": 0.24282209613422964,\n\
\ \"acc_norm_stderr\": 0.031047522450097502,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731604,\n \"mc2\": 0.4837259674197354,\n\
\ \"mc2_stderr\": 0.016188666928951607\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20819112627986347,\n \"acc_stderr\": 0.011864866118448066,\n\
\ \"acc_norm\": 0.2841296928327645,\n \"acc_norm_stderr\": 0.013179442447653887\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25871340370444135,\n\
\ \"acc_stderr\": 0.00437032822483179,\n \"acc_norm\": 0.26488747261501694,\n\
\ \"acc_norm_stderr\": 0.0044037143273799075\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123394,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123394\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.024618298195866507,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.024618298195866507\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788147,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788147\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03123475237772117,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03123475237772117\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.20202020202020202,\n\
\ \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.20202020202020202,\n\
\ \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958955,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958955\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21100917431192662,\n \"acc_stderr\": 0.017493922404112648,\n \"\
acc_norm\": 0.21100917431192662,\n \"acc_norm_stderr\": 0.017493922404112648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.02769691071309394,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.02769691071309394\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035307,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n\
\ \"acc_stderr\": 0.029763779406874972,\n \"acc_norm\": 0.26905829596412556,\n\
\ \"acc_norm_stderr\": 0.029763779406874972\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.04391326286724071,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04391326286724071\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\
\ \"acc_stderr\": 0.029202540153431183,\n \"acc_norm\": 0.27350427350427353,\n\
\ \"acc_norm_stderr\": 0.029202540153431183\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069374,\n\
\ \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087873,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729903,\n \"\
acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729903\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.258148631029987,\n\
\ \"acc_stderr\": 0.011176923719313394,\n \"acc_norm\": 0.258148631029987,\n\
\ \"acc_norm_stderr\": 0.011176923719313394\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487424,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487424\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322256,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322256\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20816326530612245,\n \"acc_stderr\": 0.025991117672813296,\n\
\ \"acc_norm\": 0.20816326530612245,\n \"acc_norm_stderr\": 0.025991117672813296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731604,\n \"mc2\": 0.4837259674197354,\n\
\ \"mc2_stderr\": 0.016188666928951607\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5153906866614049,\n \"acc_stderr\": 0.014045826789783656\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/DrNicefellow/Mistral-2-from-Mixtral-8x7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|arc:challenge|25_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|gsm8k|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hellaswag|10_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-20-57.186857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T14-20-57.186857.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- '**/details_harness|winogrande|5_2024-04-15T14-20-57.186857.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T14-20-57.186857.parquet'
- config_name: results
data_files:
- split: 2024_04_15T14_20_57.186857
path:
- results_2024-04-15T14-20-57.186857.parquet
- split: latest
path:
- results_2024-04-15T14-20-57.186857.parquet
---
# Dataset Card for Evaluation run of DrNicefellow/Mistral-2-from-Mixtral-8x7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DrNicefellow/Mistral-2-from-Mixtral-8x7B-v0.1](https://huggingface.co/DrNicefellow/Mistral-2-from-Mixtral-8x7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DrNicefellow__Mistral-2-from-Mixtral-8x7B-v0.1",
	"harness_winogrande_5",
	split="latest")
```
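The config names above are derived from the internal task labels used in the parquet file names (for example, `harness|hendrycksTest-anatomy|5` corresponds to the config `harness_hendrycksTest_anatomy_5`). A minimal sketch of that mapping, assuming it consists only of replacing the separator characters `|`, `-`, `:`, and `.` with underscores:

```python
import re

def task_to_config_name(task_label: str) -> str:
    # Replace the separators used in the parquet task labels
    # ('|', '-', ':', '.') with underscores to get the config name.
    return re.sub(r"[|\-:.]", "_", task_label)

print(task_to_config_name("harness|hendrycksTest-anatomy|5"))  # harness_hendrycksTest_anatomy_5
print(task_to_config_name("harness|truthfulqa:mc|0"))          # harness_truthfulqa_mc_0
```

This can be handy for turning the task keys in the results JSON below into config names for `load_dataset`.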
## Latest results
These are the [latest results from run 2024-04-15T14:20:57.186857](https://huggingface.co/datasets/open-llm-leaderboard/details_DrNicefellow__Mistral-2-from-Mixtral-8x7B-v0.1/blob/main/results_2024-04-15T14-20-57.186857.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its own splits as well as in the "latest" split of each configuration):
```python
{
"all": {
"acc": 0.24196363480520452,
"acc_stderr": 0.03023773260514397,
"acc_norm": 0.24282209613422964,
"acc_norm_stderr": 0.031047522450097502,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731604,
"mc2": 0.4837259674197354,
"mc2_stderr": 0.016188666928951607
},
"harness|arc:challenge|25": {
"acc": 0.20819112627986347,
"acc_stderr": 0.011864866118448066,
"acc_norm": 0.2841296928327645,
"acc_norm_stderr": 0.013179442447653887
},
"harness|hellaswag|10": {
"acc": 0.25871340370444135,
"acc_stderr": 0.00437032822483179,
"acc_norm": 0.26488747261501694,
"acc_norm_stderr": 0.0044037143273799075
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123394,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123394
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2,
"acc_stderr": 0.024618298195866507,
"acc_norm": 0.2,
"acc_norm_stderr": 0.024618298195866507
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788147,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788147
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2,
"acc_stderr": 0.03123475237772117,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03123475237772117
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958955,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958955
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21100917431192662,
"acc_stderr": 0.017493922404112648,
"acc_norm": 0.21100917431192662,
"acc_norm_stderr": 0.017493922404112648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.02769691071309394,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.02769691071309394
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035307,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874972,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874972
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04391326286724071,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04391326286724071
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952688,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952688
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431183,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431183
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069374,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.024404394928087873,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.024404394928087873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.25,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729903,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729903
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.258148631029987,
"acc_stderr": 0.011176923719313394,
"acc_norm": 0.258148631029987,
"acc_norm_stderr": 0.011176923719313394
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487424,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487424
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322256,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322256
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072773,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072773
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20816326530612245,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.20816326530612245,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731604,
"mc2": 0.4837259674197354,
"mc2_stderr": 0.016188666928951607
},
"harness|winogrande|5": {
"acc": 0.5153906866614049,
"acc_stderr": 0.014045826789783656
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
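To illustrate how these per-task numbers roll up into an aggregate score, one can average `acc` over the `hendrycksTest` (MMLU) subtasks. This is a sketch over a small excerpt of the dict above; the leaderboard's own aggregation may weight or select tasks differently:

```python
# Small excerpt of the results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.3333333333333333},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
    "harness|winogrande|5": {"acc": 0.5153906866614049},  # not an MMLU task
}

# Keep only the MMLU (hendrycksTest) subtasks and average their accuracy.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} subtasks: {mmlu_avg:.4f}")
```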
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]