| datasetId | card |
|---|---|
snake26/FastFoodData | ---
dataset_info:
features:
- name: text
dtype: string
- name: score
dtype: int64
splits:
- name: train
num_bytes: 450884
num_examples: 3009
- name: validation
num_bytes: 49744
num_examples: 376
- name: test
num_bytes: 32035
num_examples: 377
download_size: 319464
dataset_size: 532663
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Falah/1M_luxury_yacht_SDXL_refiner_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 884352230
num_examples: 1000000
download_size: 86298384
dataset_size: 884352230
---
# Dataset Card for "1M_luxury_yacht_SDXL_refiner_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-conll2003-conll2003-623e8b-1865063750 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: dslim/bert-large-NER
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: dslim/bert-large-NER
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
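The `col_mapping` in the metadata above tells the evaluator which dataset columns to read its expected `tokens` and `tags` inputs from. A minimal sketch of applying such a mapping (the record below is illustrative, not an actual conll2003 row):

```python
# Sketch of applying a col_mapping like the one in the metadata: the
# evaluator expects columns named "tokens" and "tags", and the mapping
# names the dataset columns to read them from.
col_mapping = {"tokens": "tokens", "tags": "ner_tags"}

def remap(example, mapping):
    """Return a record keyed by the evaluator's expected column names."""
    return {expected: example[source] for expected, source in mapping.items()}

# Illustrative record shaped like a conll2003 row (values are made up).
example = {"tokens": ["EU", "rejects", "German", "call"], "ner_tags": [3, 0, 7, 0]}
print(remap(example, col_mapping))
# {'tokens': ['EU', 'rejects', 'German', 'call'], 'tags': [3, 0, 7, 0]}
```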
## Contributions
Thanks to [@rdecoupes](https://huggingface.co/rdecoupes) for evaluating this model. |
Deathspike/magical-girl-lyrical-nanoha-strikers | ---
license: cc-by-nc-sa-4.0
---
|
frankier/cross_domain_reviews | ---
language:
- en
language_creators:
- found
license: unknown
multilinguality:
- monolingual
pretty_name: Blue
size_categories:
- 10K<n<100K
source_datasets:
- extended|app_reviews
tags:
- reviews
- ratings
- ordinal
- text
task_categories:
- text-classification
task_ids:
- text-scoring
- sentiment-scoring
---
This dataset is a quick-and-dirty benchmark for predicting ratings from text
across different domains and on different rating scales. It pulls in a number
of rating datasets, takes at most 1000 instances from each, and combines them
into a single dataset.
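The combining step described above can be sketched in a few lines (the sources, field names, and `combine` helper below are illustrative; the actual build script lives in the dataset repo):

```python
import random

MAX_PER_SOURCE = 1000  # "takes at most 1000 instances from each"

def combine(sources, seed=42):
    """Sample at most MAX_PER_SOURCE rows from each source dataset and
    concatenate them, tagging each row with the domain it came from."""
    rng = random.Random(seed)
    combined = []
    for domain, rows in sources.items():
        sample = rng.sample(rows, min(MAX_PER_SOURCE, len(rows)))
        combined.extend({"domain": domain, **row} for row in sample)
    return combined

# Illustrative sources with different rating scales.
sources = {
    "apps": [{"text": "great app", "rating": 5}] * 1500,
    "films": [{"text": "dull plot", "rating": 2}] * 300,
}
data = combine(sources)
print(len(data))  # 1300: 1000 sampled from apps, all 300 from films
```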
Building the dataset requires the `kaggle` library to be installed, and Kaggle
API credentials supplied either through environment variables or in
`~/.kaggle/kaggle.json`. See [the Kaggle
docs](https://www.kaggle.com/docs/api#authentication).
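For reference, a small sketch of where those credentials are looked up (the helper below is illustrative; the kaggle client performs its own, stricter validation):

```python
import json
import os
from pathlib import Path

def kaggle_credentials_available():
    """Check the two places the kaggle client looks for credentials:
    environment variables first, then ~/.kaggle/kaggle.json."""
    if os.environ.get("KAGGLE_USERNAME") and os.environ.get("KAGGLE_KEY"):
        return True
    cfg = Path.home() / ".kaggle" / "kaggle.json"
    if cfg.is_file():
        try:
            creds = json.loads(cfg.read_text())
        except json.JSONDecodeError:
            return False
        return {"username", "key"} <= creds.keys()
    return False
```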
|
ethansimrm/OpusTest | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fr
splits:
- name: train
num_bytes: 6599408.6
num_examples: 25417
download_size: 4758293
dataset_size: 6599408.6
---
# Dataset Card for "OpusTest"
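Given the `translation` feature declared in the metadata, each row holds an aligned en/fr pair keyed by language code. A sketch of accessing one (the sentence pair below is made up, not taken from the dataset):

```python
# Illustrative row shaped like the declared features; the text is made up.
row = {
    "id": "0",
    "translation": {"en": "The cat sleeps.", "fr": "Le chat dort."},
}
src, tgt = row["translation"]["en"], row["translation"]["fr"]
print(f"{src} -> {tgt}")  # The cat sleeps. -> Le chat dort.
```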
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AWeirdDev/websites | ---
license: mit
---
|
open-llm-leaderboard/details_andrijdavid__Macaroni-v2-7b | ---
pretty_name: Evaluation run of andrijdavid/Macaroni-v2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [andrijdavid/Macaroni-v2-7b](https://huggingface.co/andrijdavid/Macaroni-v2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andrijdavid__Macaroni-v2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T17:13:58.096969](https://huggingface.co/datasets/open-llm-leaderboard/details_andrijdavid__Macaroni-v2-7b/blob/main/results_2024-02-09T17-13-58.096969.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6097753596221166,\n\
\ \"acc_stderr\": 0.032837742881645295,\n \"acc_norm\": 0.6176689414756206,\n\
\ \"acc_norm_stderr\": 0.03357785407659726,\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.6706721305702877,\n\
\ \"mc2_stderr\": 0.01590869964991477\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407163,\n\
\ \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537302\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7102170882294364,\n\
\ \"acc_stderr\": 0.004527343651130801,\n \"acc_norm\": 0.8383788090021908,\n\
\ \"acc_norm_stderr\": 0.0036735065123709503\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949096,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949096\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.03031371053819889,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.03031371053819889\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906944,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906944\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646035,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646035\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407003,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.01637696614261008,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.01637696614261008\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.02724561304721536,\n\
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.02724561304721536\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621358,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621358\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.02955545423677886,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.02955545423677886\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n\
\ \"acc_stderr\": 0.012712265105889133,\n \"acc_norm\": 0.45241199478487615,\n\
\ \"acc_norm_stderr\": 0.012712265105889133\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.6706721305702877,\n\
\ \"mc2_stderr\": 0.01590869964991477\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597207\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13419257012888552,\n \
\ \"acc_stderr\": 0.009388953419897726\n }\n}\n```"
repo_url: https://huggingface.co/andrijdavid/Macaroni-v2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|arc:challenge|25_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|gsm8k|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hellaswag|10_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T17-13-58.096969.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T17-13-58.096969.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- '**/details_harness|winogrande|5_2024-02-09T17-13-58.096969.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T17-13-58.096969.parquet'
- config_name: results
data_files:
- split: 2024_02_09T17_13_58.096969
path:
- results_2024-02-09T17-13-58.096969.parquet
- split: latest
path:
- results_2024-02-09T17-13-58.096969.parquet
---
# Dataset Card for Evaluation run of andrijdavid/Macaroni-v2-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andrijdavid/Macaroni-v2-7b](https://huggingface.co/andrijdavid/Macaroni-v2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andrijdavid__Macaroni-v2-7b",
"harness_winogrande_5",
split="train")
```
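The configuration names listed in the metadata above follow a simple pattern derived from the harness task names: `|`, `-`, and `:` are replaced with `_`, and the few-shot count is kept as the final segment (e.g. `harness|hendrycksTest-anatomy|5` becomes `harness_hendrycksTest_anatomy_5`). A small hypothetical helper illustrating this mapping (not part of the leaderboard tooling, just an observation of the naming scheme in this card):

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task key (as used in the results JSON) to the
    corresponding dataset config name, based on the pattern observed
    in this card: '|', '-', and ':' all become '_'."""
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

# Examples matching the configs listed above:
# "harness|hendrycksTest-anatomy|5" -> "harness_hendrycksTest_anatomy_5"
# "harness|truthfulqa:mc|0"         -> "harness_truthfulqa_mc_0"
```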
## Latest results
These are the [latest results from run 2024-02-09T17:13:58.096969](https://huggingface.co/datasets/open-llm-leaderboard/details_andrijdavid__Macaroni-v2-7b/blob/main/results_2024-02-09T17-13-58.096969.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6097753596221166,
"acc_stderr": 0.032837742881645295,
"acc_norm": 0.6176689414756206,
"acc_norm_stderr": 0.03357785407659726,
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.6706721305702877,
"mc2_stderr": 0.01590869964991477
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407163,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537302
},
"harness|hellaswag|10": {
"acc": 0.7102170882294364,
"acc_stderr": 0.004527343651130801,
"acc_norm": 0.8383788090021908,
"acc_norm_stderr": 0.0036735065123709503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949096,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949096
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.03031371053819889,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.03031371053819889
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906944,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906944
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646035,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646035
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.01637696614261008,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.01637696614261008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.02724561304721536,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.02724561304721536
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621358,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621358
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.02955545423677886,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.02955545423677886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45241199478487615,
"acc_stderr": 0.012712265105889133,
"acc_norm": 0.45241199478487615,
"acc_norm_stderr": 0.012712265105889133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.01952431674486635,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.01952431674486635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.6706721305702877,
"mc2_stderr": 0.01590869964991477
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597207
},
"harness|gsm8k|5": {
"acc": 0.13419257012888552,
"acc_stderr": 0.009388953419897726
}
}
```
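Once loaded (or read directly from the JSON file linked above), the per-task scores are plain dictionaries keyed by the harness task name, so they can be processed without any special tooling. The snippet below is a minimal sketch that averages the `acc` metric over the MMLU (`hendrycksTest`) tasks; `results` here is a small hypothetical excerpt of the full dict above, shown only for illustration:

```python
# Minimal sketch: average "acc" over all hendrycksTest (MMLU) tasks.
# `results` is a small hypothetical excerpt of the results dict above.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5703703703703704},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
    "harness|winogrande|5": {"acc": 0.7955801104972375},
}

mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mean_mmlu_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU tasks: {len(mmlu_accs)}, mean acc: {mean_mmlu_acc:.4f}")
```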
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Kukedlc__NeuTrixOmniBe-DPO
---
pretty_name: Evaluation run of Kukedlc/NeuTrixOmniBe-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/NeuTrixOmniBe-DPO](https://huggingface.co/Kukedlc/NeuTrixOmniBe-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__NeuTrixOmniBe-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T20:38:37.325386](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuTrixOmniBe-DPO/blob/main/results_2024-02-11T20-38-37.325386.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6498204023792844,\n\
\ \"acc_stderr\": 0.03209030342573865,\n \"acc_norm\": 0.6490416180374244,\n\
\ \"acc_norm_stderr\": 0.03276416585998908,\n \"mc1\": 0.6230110159118727,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.7721852797961962,\n\
\ \"mc2_stderr\": 0.013889279661845924\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7184300341296929,\n \"acc_stderr\": 0.013143376735009019,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7146982672774348,\n\
\ \"acc_stderr\": 0.004506351723820959,\n \"acc_norm\": 0.8903604859589723,\n\
\ \"acc_norm_stderr\": 0.003118013608669293\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834846,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834846\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.016536829648997105,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.016536829648997105\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6230110159118727,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.7721852797961962,\n\
\ \"mc2_stderr\": 0.013889279661845924\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6830932524639879,\n \
\ \"acc_stderr\": 0.012815868296721364\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/NeuTrixOmniBe-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|arc:challenge|25_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|arc:challenge|25_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|gsm8k|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|gsm8k|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hellaswag|10_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hellaswag|10_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T08-04-47.890173.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T20-38-37.325386.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T20-38-37.325386.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- '**/details_harness|winogrande|5_2024-02-11T08-04-47.890173.parquet'
- split: 2024_02_11T20_38_37.325386
path:
- '**/details_harness|winogrande|5_2024-02-11T20-38-37.325386.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T20-38-37.325386.parquet'
- config_name: results
data_files:
- split: 2024_02_11T08_04_47.890173
path:
- results_2024-02-11T08-04-47.890173.parquet
- split: 2024_02_11T20_38_37.325386
path:
- results_2024-02-11T20-38-37.325386.parquet
- split: latest
path:
- results_2024-02-11T20-38-37.325386.parquet
---
# Dataset Card for Evaluation run of Kukedlc/NeuTrixOmniBe-DPO
Dataset automatically created during the evaluation run of model [Kukedlc/NeuTrixOmniBe-DPO](https://huggingface.co/Kukedlc/NeuTrixOmniBe-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__NeuTrixOmniBe-DPO",
	"harness_winogrande_5",
	split="latest")
```
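The loaded details and results are plain nested dicts/tables, so per-task metrics can be sliced out with ordinary Python. A minimal sketch, using a hand-copied sample of the values from the "Latest results" excerpt below so no download is needed (the variable names are illustrative, not part of the dataset):

```python
# Sample values copied from the "Latest results" JSON below; in practice this
# dict would come from the "results" configuration of the dataset.
results = {
    "all": {"acc": 0.6498204023792844, "acc_norm": 0.6490416180374244},
    "harness|arc:challenge|25": {"acc": 0.7184300341296929, "acc_norm": 0.7295221843003413},
    "harness|hellaswag|10": {"acc": 0.7146982672774348, "acc_norm": 0.8903604859589723},
}

# Keep only the per-task entries (dropping the "all" aggregate) and their
# normalized accuracy.
task_acc_norm = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task != "all"
}

# Task with the highest normalized accuracy in this sample.
best_task = max(task_acc_norm, key=task_acc_norm.get)
print(best_task)  # harness|hellaswag|10
```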
## Latest results

These are the [latest results from run 2024-02-11T20:38:37.325386](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuTrixOmniBe-DPO/blob/main/results_2024-02-11T20-38-37.325386.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6498204023792844,
"acc_stderr": 0.03209030342573865,
"acc_norm": 0.6490416180374244,
"acc_norm_stderr": 0.03276416585998908,
"mc1": 0.6230110159118727,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.7721852797961962,
"mc2_stderr": 0.013889279661845924
},
"harness|arc:challenge|25": {
"acc": 0.7184300341296929,
"acc_stderr": 0.013143376735009019,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7146982672774348,
"acc_stderr": 0.004506351723820959,
"acc_norm": 0.8903604859589723,
"acc_norm_stderr": 0.003118013608669293
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944427,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944427
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834846,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834846
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.016536829648997105,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.016536829648997105
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6230110159118727,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.7721852797961962,
"mc2_stderr": 0.013889279661845924
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.6830932524639879,
"acc_stderr": 0.012815868296721364
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anan-2024/twitter_dataset_1713006174 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 309856
num_examples: 836
download_size: 166535
dataset_size: 309856
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingartists/queen | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/queen"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.622527 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/30a049d2de687550227ba815650eb196.585x585x1.png')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/queen">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Queen</div>
<a href="https://genius.com/artists/queen">
<div style="text-align: center; font-size: 14px;">@queen</div>
</a>
</div>
### Dataset Summary
This dataset contains lyrics parsed from Genius and is designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/queen).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/queen")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   580 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation' and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/queen")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(
    datasets['train']['text'],
    [
        int(len(datasets['train']['text']) * train_percentage),
        int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
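As a sanity check (not part of the original card), the boundary indices that `np.split` uses above can be computed by hand for the 580-example 'train' split:

```python
# Reproduce the two cut points passed to np.split for 580 examples,
# using the same percentages as the snippet above.
n_examples = 580
train_percentage = 0.9
validation_percentage = 0.07

first_cut = int(n_examples * train_percentage)                             # end of train
second_cut = int(n_examples * (train_percentage + validation_percentage))  # end of validation

sizes = (first_cut, second_cut - first_cut, n_examples - second_cut)
print(sizes)  # (522, 40, 18)
```

The three sizes always sum to the original split length, so no example is dropped.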
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
CaioConteudos/criancafeliz | ---
license: openrail
---
|
hemachandher/pathdataset | ---
dataset_info:
features:
- name: image
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 581
num_examples: 2
download_size: 3053
dataset_size: 581
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ctang/gpt_deontology_eval_llama2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: scenario
dtype: string
- name: excuse
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1856
num_examples: 10
download_size: 3920
dataset_size: 1856
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-30 | ---
pretty_name: Evaluation run of wang7776/vicuna-7b-v1.3-attention-sparsity-30
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wang7776/vicuna-7b-v1.3-attention-sparsity-30](https://huggingface.co/wang7776/vicuna-7b-v1.3-attention-sparsity-30)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-30\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T22:20:40.469110](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-30/blob/main/results_2024-01-26T22-20-40.469110.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4679611078395129,\n\
\ \"acc_stderr\": 0.034431867302886984,\n \"acc_norm\": 0.47400106260370506,\n\
\ \"acc_norm_stderr\": 0.03521417072130731,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589667,\n \"mc2\": 0.4606430363617052,\n\
\ \"mc2_stderr\": 0.0149404570249728\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.014582236460866977,\n\
\ \"acc_norm\": 0.5102389078498294,\n \"acc_norm_stderr\": 0.014608326906285015\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5689105755825533,\n\
\ \"acc_stderr\": 0.004942164585991471,\n \"acc_norm\": 0.7640908185620394,\n\
\ \"acc_norm_stderr\": 0.004236980145344305\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.030656748696739435,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.030656748696739435\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523857,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523857\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.028414985019707868,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.028414985019707868\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998575,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.03878372113711274,\n\
\ \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.03878372113711274\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104284,\n\
\ \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104284\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.025275892070240634,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.025275892070240634\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145658,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145658\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6256880733944954,\n \"acc_stderr\": 0.020748959408988313,\n \"\
acc_norm\": 0.6256880733944954,\n \"acc_norm_stderr\": 0.020748959408988313\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.03434131164719129,\n \"\
acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.03434131164719129\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578757,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578757\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138938,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138938\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n\
\ \"acc_stderr\": 0.029745048572674057,\n \"acc_norm\": 0.7094017094017094,\n\
\ \"acc_norm_stderr\": 0.029745048572674057\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6564495530012772,\n\
\ \"acc_stderr\": 0.016982145632652462,\n \"acc_norm\": 0.6564495530012772,\n\
\ \"acc_norm_stderr\": 0.016982145632652462\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.565359477124183,\n \"acc_stderr\": 0.028384256704883037,\n\
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.028384256704883037\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5273311897106109,\n\
\ \"acc_stderr\": 0.028355633568328174,\n \"acc_norm\": 0.5273311897106109,\n\
\ \"acc_norm_stderr\": 0.028355633568328174\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.02774431344337654,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.02774431344337654\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3428943937418514,\n\
\ \"acc_stderr\": 0.012123463271585892,\n \"acc_norm\": 0.3428943937418514,\n\
\ \"acc_norm_stderr\": 0.012123463271585892\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464626,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42810457516339867,\n \"acc_stderr\": 0.020017629214213097,\n \
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.020017629214213097\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.45454545454545453,\n\
\ \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.45454545454545453,\n\
\ \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6268656716417911,\n\
\ \"acc_stderr\": 0.034198326081760065,\n \"acc_norm\": 0.6268656716417911,\n\
\ \"acc_norm_stderr\": 0.034198326081760065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120575,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120575\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.03722965741385539,\n\
\ \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.03722965741385539\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589667,\n \"mc2\": 0.4606430363617052,\n\
\ \"mc2_stderr\": 0.0149404570249728\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6929755327545383,\n \"acc_stderr\": 0.012963688616969471\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \
\ \"acc_stderr\": 0.009065050306776911\n }\n}\n```"
repo_url: https://huggingface.co/wang7776/vicuna-7b-v1.3-attention-sparsity-30
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|arc:challenge|25_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|gsm8k|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hellaswag|10_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T22-20-40.469110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T22-20-40.469110.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- '**/details_harness|winogrande|5_2024-01-26T22-20-40.469110.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T22-20-40.469110.parquet'
- config_name: results
data_files:
- split: 2024_01_26T22_20_40.469110
path:
- results_2024-01-26T22-20-40.469110.parquet
- split: latest
path:
- results_2024-01-26T22-20-40.469110.parquet
---
# Dataset Card for Evaluation run of wang7776/vicuna-7b-v1.3-attention-sparsity-30
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/vicuna-7b-v1.3-attention-sparsity-30](https://huggingface.co/wang7776/vicuna-7b-v1.3-attention-sparsity-30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
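Because the timestamped split names use a fixed `YYYY_MM_DDTHH_MM_SS` layout, they sort lexicographically in chronological order, so the newest run can be identified with a plain `max`. A minimal sketch (the first split name below is taken from this card; the second is a hypothetical earlier run):

```python
# Timestamped split names as they appear in this dataset's configurations.
splits = [
    "2024_01_26T22_20_40.469110",  # from this card
    "2024_01_10T09_15_03.123456",  # hypothetical earlier run
]

# Fixed-width, ISO-like ordering: the lexicographic maximum is the newest run,
# i.e. the split that "latest" points to.
latest = max(splits)
print(latest)
```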
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-30",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-26T22:20:40.469110](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-attention-sparsity-30/blob/main/results_2024-01-26T22-20-40.469110.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4679611078395129,
"acc_stderr": 0.034431867302886984,
"acc_norm": 0.47400106260370506,
"acc_norm_stderr": 0.03521417072130731,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589667,
"mc2": 0.4606430363617052,
"mc2_stderr": 0.0149404570249728
},
"harness|arc:challenge|25": {
"acc": 0.4684300341296928,
"acc_stderr": 0.014582236460866977,
"acc_norm": 0.5102389078498294,
"acc_norm_stderr": 0.014608326906285015
},
"harness|hellaswag|10": {
"acc": 0.5689105755825533,
"acc_stderr": 0.004942164585991471,
"acc_norm": 0.7640908185620394,
"acc_norm_stderr": 0.004236980145344305
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.030656748696739435,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.030656748696739435
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523857,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523857
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.028414985019707868,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.028414985019707868
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998575,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5575757575757576,
"acc_stderr": 0.03878372113711274,
"acc_norm": 0.5575757575757576,
"acc_norm_stderr": 0.03878372113711274
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6580310880829016,
"acc_stderr": 0.03423465100104284,
"acc_norm": 0.6580310880829016,
"acc_norm_stderr": 0.03423465100104284
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.025275892070240634,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.025275892070240634
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145658,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145658
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6256880733944954,
"acc_stderr": 0.020748959408988313,
"acc_norm": 0.6256880733944954,
"acc_norm_stderr": 0.020748959408988313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578757,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578757
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138938,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138938
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674057,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674057
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6564495530012772,
"acc_stderr": 0.016982145632652462,
"acc_norm": 0.6564495530012772,
"acc_norm_stderr": 0.016982145632652462
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.028384256704883037,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.028384256704883037
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5273311897106109,
"acc_stderr": 0.028355633568328174,
"acc_norm": 0.5273311897106109,
"acc_norm_stderr": 0.028355633568328174
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.02774431344337654,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.02774431344337654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3428943937418514,
"acc_stderr": 0.012123463271585892,
"acc_norm": 0.3428943937418514,
"acc_norm_stderr": 0.012123463271585892
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.029972807170464626,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.029972807170464626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.020017629214213097,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.020017629214213097
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5510204081632653,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.5510204081632653,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6268656716417911,
"acc_stderr": 0.034198326081760065,
"acc_norm": 0.6268656716417911,
"acc_norm_stderr": 0.034198326081760065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120575,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120575
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6198830409356725,
"acc_stderr": 0.03722965741385539,
"acc_norm": 0.6198830409356725,
"acc_norm_stderr": 0.03722965741385539
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589667,
"mc2": 0.4606430363617052,
"mc2_stderr": 0.0149404570249728
},
"harness|winogrande|5": {
"acc": 0.6929755327545383,
"acc_stderr": 0.012963688616969471
},
"harness|gsm8k|5": {
"acc": 0.12357846853677028,
"acc_stderr": 0.009065050306776911
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Helsinki-NLP/opus_books | ---
annotations_creators:
- found
language_creators:
- found
language:
- ca
- de
- el
- en
- eo
- es
- fi
- fr
- hu
- it
- nl
- 'no'
- pl
- pt
- ru
- sv
license:
- other
multilinguality:
- multilingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- translation
task_ids: []
pretty_name: OpusBooks
dataset_info:
- config_name: ca-de
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- ca
- de
splits:
- name: train
num_bytes: 899553
num_examples: 4445
download_size: 609128
dataset_size: 899553
- config_name: ca-en
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- ca
- en
splits:
- name: train
num_bytes: 863162
num_examples: 4605
download_size: 585612
dataset_size: 863162
- config_name: ca-hu
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- ca
- hu
splits:
- name: train
num_bytes: 886150
num_examples: 4463
download_size: 608827
dataset_size: 886150
- config_name: ca-nl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- ca
- nl
splits:
- name: train
num_bytes: 884811
num_examples: 4329
download_size: 594793
dataset_size: 884811
- config_name: de-en
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- en
splits:
- name: train
num_bytes: 13738975
num_examples: 51467
download_size: 8797832
dataset_size: 13738975
- config_name: de-eo
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- eo
splits:
- name: train
num_bytes: 398873
num_examples: 1363
download_size: 253509
dataset_size: 398873
- config_name: de-es
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- es
splits:
- name: train
num_bytes: 7592451
num_examples: 27526
download_size: 4841017
dataset_size: 7592451
- config_name: de-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- fr
splits:
- name: train
num_bytes: 9544351
num_examples: 34916
download_size: 6164101
dataset_size: 9544351
- config_name: de-hu
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- hu
splits:
- name: train
num_bytes: 13514971
num_examples: 51780
download_size: 8814744
dataset_size: 13514971
- config_name: de-it
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- it
splits:
- name: train
num_bytes: 7759984
num_examples: 27381
download_size: 4901036
dataset_size: 7759984
- config_name: de-nl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- nl
splits:
- name: train
num_bytes: 3561740
num_examples: 15622
download_size: 2290868
dataset_size: 3561740
- config_name: de-pt
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- pt
splits:
- name: train
num_bytes: 317143
num_examples: 1102
download_size: 197768
dataset_size: 317143
- config_name: de-ru
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- ru
splits:
- name: train
num_bytes: 5764649
num_examples: 17373
download_size: 3255537
dataset_size: 5764649
- config_name: el-en
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- el
- en
splits:
- name: train
num_bytes: 552567
num_examples: 1285
download_size: 310863
dataset_size: 552567
- config_name: el-es
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- el
- es
splits:
- name: train
num_bytes: 527979
num_examples: 1096
download_size: 298827
dataset_size: 527979
- config_name: el-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- el
- fr
splits:
- name: train
num_bytes: 539921
num_examples: 1237
download_size: 303181
dataset_size: 539921
- config_name: el-hu
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- el
- hu
splits:
- name: train
num_bytes: 546278
num_examples: 1090
download_size: 313292
dataset_size: 546278
- config_name: en-eo
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- eo
splits:
- name: train
num_bytes: 386219
num_examples: 1562
download_size: 246715
dataset_size: 386219
- config_name: en-es
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- es
splits:
- name: train
num_bytes: 25291663
num_examples: 93470
download_size: 16080303
dataset_size: 25291663
- config_name: en-fi
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fi
splits:
- name: train
num_bytes: 715027
num_examples: 3645
download_size: 467851
dataset_size: 715027
- config_name: en-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fr
splits:
- name: train
num_bytes: 32997043
num_examples: 127085
download_size: 20985324
dataset_size: 32997043
- config_name: en-hu
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- hu
splits:
- name: train
num_bytes: 35256766
num_examples: 137151
download_size: 23065198
dataset_size: 35256766
- config_name: en-it
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- it
splits:
- name: train
num_bytes: 8993755
num_examples: 32332
download_size: 5726189
dataset_size: 8993755
- config_name: en-nl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- nl
splits:
- name: train
num_bytes: 10277990
num_examples: 38652
download_size: 6443323
dataset_size: 10277990
- config_name: en-no
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- 'no'
splits:
- name: train
num_bytes: 661966
num_examples: 3499
download_size: 429631
dataset_size: 661966
- config_name: en-pl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- pl
splits:
- name: train
num_bytes: 583079
num_examples: 2831
download_size: 389337
dataset_size: 583079
- config_name: en-pt
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- pt
splits:
- name: train
num_bytes: 309677
num_examples: 1404
download_size: 191493
dataset_size: 309677
- config_name: en-ru
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- ru
splits:
- name: train
num_bytes: 5190856
num_examples: 17496
download_size: 2922360
dataset_size: 5190856
- config_name: en-sv
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- sv
splits:
- name: train
num_bytes: 790773
num_examples: 3095
download_size: 516328
dataset_size: 790773
- config_name: eo-es
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- eo
- es
splits:
- name: train
num_bytes: 409579
num_examples: 1677
download_size: 265543
dataset_size: 409579
- config_name: eo-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- eo
- fr
splits:
- name: train
num_bytes: 412987
num_examples: 1588
download_size: 261689
dataset_size: 412987
- config_name: eo-hu
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- eo
- hu
splits:
- name: train
num_bytes: 389100
num_examples: 1636
download_size: 258229
dataset_size: 389100
- config_name: eo-it
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- eo
- it
splits:
- name: train
num_bytes: 387594
num_examples: 1453
download_size: 248748
dataset_size: 387594
- config_name: eo-pt
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- eo
- pt
splits:
- name: train
num_bytes: 311067
num_examples: 1259
download_size: 197021
dataset_size: 311067
- config_name: es-fi
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- fi
splits:
- name: train
num_bytes: 710450
num_examples: 3344
download_size: 467281
dataset_size: 710450
- config_name: es-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- fr
splits:
- name: train
num_bytes: 14382126
num_examples: 56319
download_size: 9164030
dataset_size: 14382126
- config_name: es-hu
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- hu
splits:
- name: train
num_bytes: 19373967
num_examples: 78800
download_size: 12691292
dataset_size: 19373967
- config_name: es-it
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- it
splits:
- name: train
num_bytes: 7837667
num_examples: 28868
download_size: 5026914
dataset_size: 7837667
- config_name: es-nl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- nl
splits:
- name: train
num_bytes: 9062341
num_examples: 32247
download_size: 5661890
dataset_size: 9062341
- config_name: es-no
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- 'no'
splits:
- name: train
num_bytes: 729113
num_examples: 3585
download_size: 473525
dataset_size: 729113
- config_name: es-pt
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- pt
splits:
- name: train
num_bytes: 326872
num_examples: 1327
download_size: 204399
dataset_size: 326872
- config_name: es-ru
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- ru
splits:
- name: train
num_bytes: 5281106
num_examples: 16793
download_size: 2995191
dataset_size: 5281106
- config_name: fi-fr
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fi
- fr
splits:
- name: train
num_bytes: 746085
num_examples: 3537
download_size: 486904
dataset_size: 746085
- config_name: fi-hu
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fi
- hu
splits:
- name: train
num_bytes: 746602
num_examples: 3504
download_size: 509394
dataset_size: 746602
- config_name: fi-no
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fi
- 'no'
splits:
- name: train
num_bytes: 691169
num_examples: 3414
download_size: 449501
dataset_size: 691169
- config_name: fi-pl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fi
- pl
splits:
- name: train
num_bytes: 613779
num_examples: 2814
download_size: 410258
dataset_size: 613779
- config_name: fr-hu
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- hu
splits:
- name: train
num_bytes: 22483025
num_examples: 89337
download_size: 14689840
dataset_size: 22483025
- config_name: fr-it
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- it
splits:
- name: train
num_bytes: 4752147
num_examples: 14692
download_size: 3040617
dataset_size: 4752147
- config_name: fr-nl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- nl
splits:
- name: train
num_bytes: 10408088
num_examples: 40017
download_size: 6528881
dataset_size: 10408088
- config_name: fr-no
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- 'no'
splits:
- name: train
num_bytes: 692774
num_examples: 3449
download_size: 449136
dataset_size: 692774
- config_name: fr-pl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- pl
splits:
- name: train
num_bytes: 614236
num_examples: 2825
download_size: 408295
dataset_size: 614236
- config_name: fr-pt
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- pt
splits:
- name: train
num_bytes: 324604
num_examples: 1263
download_size: 198700
dataset_size: 324604
- config_name: fr-ru
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- ru
splits:
- name: train
num_bytes: 2474198
num_examples: 8197
download_size: 1425660
dataset_size: 2474198
- config_name: fr-sv
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- sv
splits:
- name: train
num_bytes: 833541
num_examples: 3002
download_size: 545599
dataset_size: 833541
- config_name: hu-it
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- hu
- it
splits:
- name: train
num_bytes: 8445537
num_examples: 30949
download_size: 5477452
dataset_size: 8445537
- config_name: hu-nl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- hu
- nl
splits:
- name: train
num_bytes: 10814113
num_examples: 43428
download_size: 6985092
dataset_size: 10814113
- config_name: hu-no
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- hu
- 'no'
splits:
- name: train
num_bytes: 695485
num_examples: 3410
download_size: 465904
dataset_size: 695485
- config_name: hu-pl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- hu
- pl
splits:
- name: train
num_bytes: 616149
num_examples: 2859
download_size: 425988
dataset_size: 616149
- config_name: hu-pt
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- hu
- pt
splits:
- name: train
num_bytes: 302960
num_examples: 1184
download_size: 193053
dataset_size: 302960
- config_name: hu-ru
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- hu
- ru
splits:
- name: train
num_bytes: 7818652
num_examples: 26127
download_size: 4528613
dataset_size: 7818652
- config_name: it-nl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- it
- nl
splits:
- name: train
num_bytes: 1328293
num_examples: 2359
download_size: 824780
dataset_size: 1328293
- config_name: it-pt
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- it
- pt
splits:
- name: train
num_bytes: 301416
num_examples: 1163
download_size: 190005
dataset_size: 301416
- config_name: it-ru
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- it
- ru
splits:
- name: train
num_bytes: 5316928
num_examples: 17906
download_size: 2997871
dataset_size: 5316928
- config_name: it-sv
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- it
- sv
splits:
- name: train
num_bytes: 811401
num_examples: 2998
download_size: 527303
dataset_size: 811401
configs:
- config_name: ca-de
data_files:
- split: train
path: ca-de/train-*
- config_name: ca-en
data_files:
- split: train
path: ca-en/train-*
- config_name: ca-hu
data_files:
- split: train
path: ca-hu/train-*
- config_name: ca-nl
data_files:
- split: train
path: ca-nl/train-*
- config_name: de-en
data_files:
- split: train
path: de-en/train-*
- config_name: de-eo
data_files:
- split: train
path: de-eo/train-*
- config_name: de-es
data_files:
- split: train
path: de-es/train-*
- config_name: de-fr
data_files:
- split: train
path: de-fr/train-*
- config_name: de-hu
data_files:
- split: train
path: de-hu/train-*
- config_name: de-it
data_files:
- split: train
path: de-it/train-*
- config_name: de-nl
data_files:
- split: train
path: de-nl/train-*
- config_name: de-pt
data_files:
- split: train
path: de-pt/train-*
- config_name: de-ru
data_files:
- split: train
path: de-ru/train-*
- config_name: el-en
data_files:
- split: train
path: el-en/train-*
- config_name: el-es
data_files:
- split: train
path: el-es/train-*
- config_name: el-fr
data_files:
- split: train
path: el-fr/train-*
- config_name: el-hu
data_files:
- split: train
path: el-hu/train-*
- config_name: en-eo
data_files:
- split: train
path: en-eo/train-*
- config_name: en-es
data_files:
- split: train
path: en-es/train-*
- config_name: en-fi
data_files:
- split: train
path: en-fi/train-*
- config_name: en-fr
data_files:
- split: train
path: en-fr/train-*
- config_name: en-hu
data_files:
- split: train
path: en-hu/train-*
- config_name: en-it
data_files:
- split: train
path: en-it/train-*
- config_name: en-nl
data_files:
- split: train
path: en-nl/train-*
- config_name: en-no
data_files:
- split: train
path: en-no/train-*
- config_name: en-pl
data_files:
- split: train
path: en-pl/train-*
- config_name: en-pt
data_files:
- split: train
path: en-pt/train-*
- config_name: en-ru
data_files:
- split: train
path: en-ru/train-*
- config_name: en-sv
data_files:
- split: train
path: en-sv/train-*
- config_name: eo-es
data_files:
- split: train
path: eo-es/train-*
- config_name: eo-fr
data_files:
- split: train
path: eo-fr/train-*
- config_name: eo-hu
data_files:
- split: train
path: eo-hu/train-*
- config_name: eo-it
data_files:
- split: train
path: eo-it/train-*
- config_name: eo-pt
data_files:
- split: train
path: eo-pt/train-*
- config_name: es-fi
data_files:
- split: train
path: es-fi/train-*
- config_name: es-fr
data_files:
- split: train
path: es-fr/train-*
- config_name: es-hu
data_files:
- split: train
path: es-hu/train-*
- config_name: es-it
data_files:
- split: train
path: es-it/train-*
- config_name: es-nl
data_files:
- split: train
path: es-nl/train-*
- config_name: es-no
data_files:
- split: train
path: es-no/train-*
- config_name: es-pt
data_files:
- split: train
path: es-pt/train-*
- config_name: es-ru
data_files:
- split: train
path: es-ru/train-*
- config_name: fi-fr
data_files:
- split: train
path: fi-fr/train-*
- config_name: fi-hu
data_files:
- split: train
path: fi-hu/train-*
- config_name: fi-no
data_files:
- split: train
path: fi-no/train-*
- config_name: fi-pl
data_files:
- split: train
path: fi-pl/train-*
- config_name: fr-hu
data_files:
- split: train
path: fr-hu/train-*
- config_name: fr-it
data_files:
- split: train
path: fr-it/train-*
- config_name: fr-nl
data_files:
- split: train
path: fr-nl/train-*
- config_name: fr-no
data_files:
- split: train
path: fr-no/train-*
- config_name: fr-pl
data_files:
- split: train
path: fr-pl/train-*
- config_name: fr-pt
data_files:
- split: train
path: fr-pt/train-*
- config_name: fr-ru
data_files:
- split: train
path: fr-ru/train-*
- config_name: fr-sv
data_files:
- split: train
path: fr-sv/train-*
- config_name: hu-it
data_files:
- split: train
path: hu-it/train-*
- config_name: hu-nl
data_files:
- split: train
path: hu-nl/train-*
- config_name: hu-no
data_files:
- split: train
path: hu-no/train-*
- config_name: hu-pl
data_files:
- split: train
path: hu-pl/train-*
- config_name: hu-pt
data_files:
- split: train
path: hu-pt/train-*
- config_name: hu-ru
data_files:
- split: train
path: hu-ru/train-*
- config_name: it-nl
data_files:
- split: train
path: it-nl/train-*
- config_name: it-pt
data_files:
- split: train
path: it-pt/train-*
- config_name: it-ru
data_files:
- split: train
path: it-ru/train-*
- config_name: it-sv
data_files:
- split: train
path: it-sv/train-*
---
# Dataset Card for OPUS Books
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://opus.nlpl.eu/Books/corpus/version/Books
- **Repository:** [More Information Needed]
- **Paper:** https://aclanthology.org/L12-1246/
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
This is a collection of copyright free books aligned by Andras Farkas, which are available from http://www.farkastranslations.com/bilingual_books.php
Note that the texts are rather dated due to copyright issues and that only some of them are manually reviewed (check the meta-data at the top of the corpus files in XML). The source is multilingually aligned.
In OPUS, the alignment is formally bilingual, but the multilingual alignment can be recovered from the XCES sentence alignment files. Note also that the alignment units from the original source may include multi-sentence paragraphs, which are split and sentence-aligned in OPUS.
All texts are freely available for personal, educational and research use. Commercial use (e.g. reselling as parallel books) and mass redistribution without explicit permission are not granted. Please acknowledge the source when using the data!
Corpus statistics:
- Languages: 16
- Bitexts: 64
- Number of files: 158
- Number of tokens: 19.50M
- Sentence fragments: 0.91M
### Supported Tasks and Leaderboards
Translation.
### Languages
The languages in the dataset are:
- ca
- de
- el
- en
- eo
- es
- fi
- fr
- hu
- it
- nl
- no
- pl
- pt
- ru
- sv
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
All texts are freely available for personal, educational and research use. Commercial use (e.g. reselling as parallel books) and mass redistribution without explicit permission are not granted.
### Citation Information
Please acknowledge the source when using the data.
Please cite the following article if you use any part of the OPUS corpus in your own work:
```bibtex
@inproceedings{tiedemann-2012-parallel,
title = "Parallel Data, Tools and Interfaces in {OPUS}",
author = {Tiedemann, J{\"o}rg},
editor = "Calzolari, Nicoletta and
Choukri, Khalid and
Declerck, Thierry and
Do{\u{g}}an, Mehmet U{\u{g}}ur and
Maegaard, Bente and
Mariani, Joseph and
Moreno, Asuncion and
Odijk, Jan and
Piperidis, Stelios",
booktitle = "Proceedings of the Eighth International Conference on Language Resources and Evaluation ({LREC}'12)",
month = may,
year = "2012",
address = "Istanbul, Turkey",
publisher = "European Language Resources Association (ELRA)",
url = "http://www.lrec-conf.org/proceedings/lrec2012/pdf/463_Paper.pdf",
pages = "2214--2218",
}
```
### Contributions
Thanks to [@abhishekkrthakur](https://github.com/abhishekkrthakur) for adding this dataset. |
cmudrc/MegaFlow2D | ---
license: apache-2.0
---
|
Sowmya15/profanity_27 | ---
license: apache-2.0
---
|
CATIE-AQ/newsquadfr_fr_prompt_question_generation_with_context | ---
language:
- fr
license: cc-by-nc-sa-4.0
size_categories:
- 100K<n<1M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- newsquadfr
---
# newsquadfr_fr_prompt_question_generation_with_context
## Summary
**newsquadfr_fr_prompt_question_generation_with_context** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **101,040** rows that can be used for a question-generation (with context) task.
The original data (without prompts) comes from the dataset [newsquadfr](https://huggingface.co/datasets/lincoln/newsquadfr) and was augmented with questions in SQuAD 2.0 format in the [FrenchQA]( https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
24 prompts were created for this dataset. The prompts come in three forms: the infinitive, the informal imperative (tutoiement), and the formal imperative (vouvoiement).
```
'"'+context+'"\n Générer une question à partir du texte ci-dessus : ',
'"'+context+'"\n Génère une question à partir du texte ci-dessus : ',
'"'+context+'"\n Générez une question à partir du texte ci-dessus : ',
'"'+context+'"\n Trouver une question à partir du texte ci-dessus : ',
'"'+context+'"\n Trouve une question à partir du texte ci-dessus : ',
'"'+context+'"\n Trouvez une question à partir du texte ci-dessus : ',
'"'+context+'"\n Créer une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Crée trouver une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Créez trouver une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Ecrire une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Ecris une bonne question à partir du texte ci-dessus : ',
'"'+context+'"\n Ecrivez une bonne question à partir du texte ci-dessus : ',
'Générer une bonne question pour le texte suivant : "'+context+'"',
'Génère une bonne question pour le texte suivant : "'+context+'"',
'Générez une bonne question pour le texte suivant : "'+context+'"',
'Trouver une bonne question pour le texte suivant : "'+context+'"',
'Trouve une bonne question pour le texte suivant : "'+context+'"',
'Trouvez trouver une bonne question pour le texte suivant : "'+context+'"',
'Créer une bonne question pour le texte suivant : "'+context+'"',
'Crée trouver une bonne question pour le texte suivant : "'+context+'"',
'Créez trouver une bonne question pour le texte suivant : "'+context+'"',
'Ecrire une bonne question pour le texte suivant : "'+context+'"',
'Ecris une bonne question pour le texte suivant : "'+context+'"',
'Ecrivez une bonne question pour le texte suivant : "'+context+'"'
```
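Applying these templates can be sketched as follows. This is an illustration only: the `build_rows` helper and the `{context}` placeholder syntax are assumptions for readability, not the exact DFP build script, which concatenates strings as shown above.

```python
# Minimal sketch of applying prompt templates to a (context, question)
# pair to build the input/target columns described above.
# `build_rows` is a hypothetical helper, not part of the DFP codebase.
def build_rows(context, question, templates):
    """Return one (input, target) pair per template."""
    return [(t.format(context=context), question) for t in templates]

templates = [
    '"{context}"\n Générer une question à partir du texte ci-dessus : ',
    'Générez une bonne question pour le texte suivant : "{context}"',
]

rows = build_rows("Le chat dort.", "Que fait le chat ?", templates)
for inp, tgt in rows:
    print(repr(inp), "->", tgt)
```

Each context/question pair thus yields as many rows as there are templates, which is how the row count grows well beyond the size of the source dataset.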
# Splits
- `train` with 79,200 samples
- `valid` with 21,800 samples
- no `test` split
# How to use?
```python
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/newsquadfr_fr_prompt_question_generation_with_context")
```
# Citation
## Original data
> Hugging Face repository: https://huggingface.co/datasets/lincoln/newsquadfr
## This Dataset
```bibtex
@misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
  author    = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
  title     = { DFP (Revision 1d24c09) },
  year      = 2023,
  url       = { https://huggingface.co/datasets/CATIE-AQ/DFP },
  doi       = { 10.57967/hf/1200 },
  publisher = { Hugging Face }
}
```
## License
CC BY-NC-SA 4.0 |
ericyu/GVLM_Cropped_256 | ---
dataset_info:
features:
- name: imageA
dtype: image
- name: imageB
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 112278146.48
num_examples: 4558
- name: test
num_bytes: 37388998.684
num_examples: 1519
- name: val
num_bytes: 37425501.773
num_examples: 1519
download_size: 186554180
dataset_size: 187092646.937
---
# Dataset Card for "GVLM_Cropped_256"
This is an official release of the GVLM-CD dataset. In this version, the images are cropped into 256×256 patches.
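The grid cropping described above can be sketched as coordinate math. This is an assumption-laden illustration (non-overlapping tiles, edge remainders dropped), which may differ from the actual preprocessing used for this release.

```python
# Illustrative coordinate math for tiling an image into non-overlapping
# 256x256 patches. Assumption: simple grid cropping; remainders smaller
# than the patch size at the right/bottom edges are dropped.
def patch_boxes(width, height, size=256):
    """Return (left, top, right, bottom) boxes covering the full tiles."""
    return [(x, y, x + size, y + size)
            for y in range(0, height - size + 1, size)
            for x in range(0, width - size + 1, size)]

boxes = patch_boxes(1024, 768)
print(len(boxes))  # 4 columns x 3 rows = 12 patches
```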
If you use GVLM-CD in a scientific publication, we would appreciate using the following citations:
```
@article{zhang2023cross,
title={Cross-domain landslide mapping from large-scale remote sensing images using prototype-guided domain-aware progressive representation learning},
author={Zhang, Xiaokang and Yu, Weikang and Pun, Man-On and Shi, Wenzhong},
journal={ISPRS Journal of Photogrammetry and Remote Sensing},
volume={197},
pages={1--17},
year={2023},
publisher={Elsevier}
}
```
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nthakur/msmarco-passage-sampled-100k | ---
language:
- en
license: cc-by-sa-3.0
task_categories:
- text-retrieval
source_datasets:
- Tevatron/msmarco-passage
---
# nthakur/msmarco-passage-sampled-100k
This is a set of 100k randomly sampled training pairs from the Tevatron [msmarco-passage](https://huggingface.co/datasets/Tevatron/msmarco-passage) dataset, intended for debugging and for training models on a smaller subset of the MS MARCO training data.
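Random subsampling of this kind can be reproduced along the following lines. This is a sketch only: the seed and sampling method actually used for this release are not documented here.

```python
import random

# Sketch of drawing a fixed-size random subset of training pairs.
# The seed/method used for the actual 100k release is an assumption.
def sample_pairs(pairs, k, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    return rng.sample(pairs, k)

# toy stand-ins for real (query, positive, negatives) training pairs
pairs = [{"query_id": str(i)} for i in range(1000)]
subset = sample_pairs(pairs, 100)
print(len(subset))  # → 100
```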
## Citing & Authors
Have a look at [Tevatron](https://github.com/texttron/tevatron).
<!--- Describe where people can find more information --> |
edarchimbaud/timeseries-1d-stocks | ---
language:
- en
license: mit
task_categories:
- tabular-regression
dataset_info:
features:
- name: symbol
dtype: string
- name: date
dtype: string
- name: open
dtype: float64
- name: high
dtype: float64
- name: low
dtype: float64
- name: close
dtype: float64
- name: adj_close
dtype: float64
- name: volume
dtype: float64
splits:
- name: train
num_bytes: 598131989
num_examples: 8535427
download_size: 296223107
dataset_size: 598131989
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "timeseries-daily-sp500"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://edarchimbaud.substack.com
- **Repository:** https://github.com/edarchimbaud
- **Point of Contact:** contact@edarchimbaud.com
### Dataset Summary
The timeseries-daily-sp500 dataset provides daily historical data for companies in the S&P 500 index.
### Supported Tasks and Leaderboards
The dataset can be used to train a model for systematic trading. The model performance is evaluated based on the return / risk profile of the positions taken by the model.
### Languages
[N/A]
## Dataset Structure
### Data Instances
[N/A]
### Data Fields
- symbol (string): A string representing the ticker symbol or abbreviation used to identify the company.
- date (string): A string indicating the date of the recorded data.
- open (float64): A floating-point number representing the opening price of the stock on the given date.
- high (float64): A floating-point number representing the highest price of the stock on the given date.
- low (float64): A floating-point number representing the lowest price of the stock on the given date.
- close (float64): A floating-point number representing the closing price of the stock on the given date.
- adj_close (float64): A floating-point number representing the closing price adjusted for corporate actions such as dividends and stock splits.
- volume (float64): A number indicating the trading volume (number of shares) of the stock on the given date.
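A typical derived feature from these fields is the simple daily return computed from consecutive `close` prices per `symbol`. The rows below are toy stand-ins for the real dataset:

```python
# Toy rows standing in for the real per-symbol daily records.
rows = [
    {"symbol": "AAA", "date": "2020-01-02", "close": 100.0},
    {"symbol": "AAA", "date": "2020-01-03", "close": 102.0},
    {"symbol": "AAA", "date": "2020-01-06", "close": 99.96},
]

# Simple daily return: close[t] / close[t-1] - 1, within one symbol.
closes = [r["close"] for r in rows]
returns = [b / a - 1.0 for a, b in zip(closes, closes[1:])]
print([round(r, 4) for r in returns])  # → [0.02, -0.02]
```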
### Data Splits
A single split, called train.
## Dataset Creation
### Curation Rationale
The timeseries-daily-sp500 dataset was developed to support the development of low-frequency trading algorithms.
### Source Data
#### Initial Data Collection and Normalization
This data was sourced from the web, and aggregated.
### Annotations
#### Annotation process
[N/A]
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
[N/A]
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
The timeseries-daily-sp500 dataset was collected by https://edarchimbaud.substack.com.
### Licensing Information
The timeseries-daily-sp500 dataset is licensed under the MIT License.
### Citation Information
> https://edarchimbaud.substack.com, timeseries-daily-sp500 dataset, GitHub repository, https://github.com/edarchimbaud
### Contributions
Thanks to [@edarchimbaud](https://github.com/edarchimbaud) for adding this dataset. |
adamjweintraut/eli5_base_best_slice | ---
dataset_info:
features:
- name: q_id
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: all_answers
sequence: string
- name: num_answers
dtype: int64
- name: top_answers
sequence: string
- name: num_top_answers
dtype: int64
- name: orig
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 59584384
num_examples: 10000
- name: test
num_bytes: 7580127
num_examples: 1250
- name: validation
num_bytes: 7343699
num_examples: 1250
download_size: 46119305
dataset_size: 74508210
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
KAUE2006/Espirro | ---
license: openrail
---
|
zhengyun21/PMC-Patients-ReCDS | ---
license: cc-by-nc-sa-4.0
language:
- en
tags:
- information retrieval
- patient similarity
- clinical decision support
size_categories:
- 100K<n<1M
---
# Dataset Card for PMC-Patients-ReCDS
## Dataset Description
- **Homepage:** https://github.com/pmc-patients/pmc-patients
- **Repository:** https://github.com/pmc-patients/pmc-patients
- **Paper:** https://arxiv.org/pdf/2202.13876.pdf
- **Leaderboard:** https://pmc-patients.github.io/
- **Point of Contact:** zhengyun21@mails.tsinghua.edu.cn
### Dataset Summary
**PMC-Patients** is a first-of-its-kind dataset consisting of 167k patient summaries extracted from case reports in PubMed Central (PMC), 3.1M patient-article relevance and 293k patient-patient similarity annotations defined by PubMed citation graph.
### Supported Tasks and Leaderboards
Based on PMC-Patients, we define two tasks to benchmark Retrieval-based Clinical Decision Support (ReCDS) systems: Patient-to-Article Retrieval (PAR) and Patient-to-Patient Retrieval (PPR).
For details, please refer to [our paper](https://arxiv.org/pdf/2202.13876.pdf) and [leaderboard](https://pmc-patients.github.io/).
### Languages
English (en).
## Dataset Structure
The PMC-Patients ReCDS benchmark is presented as retrieval tasks, and the data format is the same as that of the [BEIR](https://github.com/beir-cellar/beir) benchmark.
To be specific, there are queries, a corpus, and qrels (annotations).
### Queries
ReCDS-PAR and ReCDS-PPR tasks share the same query patient set and dataset split.
For each split (train, dev, and test), queries are stored in a `jsonl` file that contains a list of dictionaries, each with two fields:
- `_id`: unique query identifier represented by patient_uid.
- `text`: query text represented by patient summary text.
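A query file in this BEIR-style format (one JSON object per line) can be read as follows; the sample lines are truncated toy data, not real entries:

```python
import json

# Sketch of reading a ReCDS query file: one JSON object per line,
# each with `_id` (patient_uid) and `text` (patient summary).
sample_lines = [
    '{"_id": "8699387-1", "text": "A 60-year-old female patient ..."}',
    '{"_id": "8647806-1", "text": "..."}',
]
queries = {q["_id"]: q["text"] for q in map(json.loads, sample_lines)}
print(len(queries))  # → 2
```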
### Corpus
The corpus is shared across splits. For ReCDS-PAR, the corpus contains 11.7M PubMed articles, and for ReCDS-PPR, it contains 155.2k reference patients from PMC-Patients. The corpus is also presented as a `jsonl` file that contains a list of dictionaries with three fields:
- `_id`: unique document identifier represented by PMID of the PubMed article in ReCDS-PAR, and patient_uid of the candidate patient in ReCDS-PPR.
- `title`: title of the article in ReCDS-PAR, and an empty string in ReCDS-PPR.
- `text`: abstract of the article in ReCDS-PAR, and patient summary text in ReCDS-PPR.
**PAR corpus note**
Due to its large size, we could not upload the full PAR corpus to Hugging Face. Instead, we provide the PMIDs of the articles included in the PAR corpus; we recommend downloading the dataset from [Figshare](https://figshare.com/collections/PMC-Patients/6723465), which contains the full PAR corpus file.
### Qrels
Qrels are TREC-style retrieval annotation files in `tsv` format.
A qrels file contains three tab-separated columns, i.e. the query identifier, corpus identifier, and score in this order. The scores (2 or 1) indicate the relevance level in ReCDS-PAR or similarity level in ReCDS-PPR.
Note that the qrels may not be the same as `relevant_articles` and `similar_patients` in `PMC-Patients.json` due to dataset split (see our manuscript for details).
### Data Instances
**A sample of query**
{"_id": "8699387-1", "text": "A 60-year-old female patient with a medical history of hypertension came to our attention because of several neurological deficits that had developed over the last few years, significantly impairing her daily life. Four years earlier, she developed sudden weakness and hypoesthesia of the right hand. The symptoms resolved in a few days and no specific diagnostic tests were performed. Two months later, she developed hypoesthesia and weakness of the right lower limb. On neurological examination at the time, she had spastic gait, ataxia, slight pronation of the right upper limb and bilateral Babinski sign. Brain MRI showed extensive white matter hyperintensities (WMHs), so leukodystrophy was suspected. However, these WMHs were located bilaterally in the corona radiata, basal ganglia, the anterior part of the temporal lobes and the medium cerebellar peduncle (A–D), and were highly suggestive of CADASIL. Genetic testing was performed, showing heterozygous mutation of the NOTCH3 gene (c.994 C<T; exon 6). The diagnosis of CADASIL was confirmed and antiplatelet prevention therapy was started. Since then, her clinical conditions remained stable, and the lesion load was unchanged at follow-up brain MRIs for 4 years until November 2020, when the patient was diagnosed with COVID-19 after a PCR nasal swab. The patient developed only mild respiratory symptoms, not requiring hospitalization or any specific treatment. Fifteen days after the COVID-19 diagnosis, she suddenly developed aphasia, agraphia and worsened right upper limb motor deficit, but she did not seek medical attention. Some days later, she reported these symptoms to her family medical doctor, and a new brain MRI was performed, showing a subacute ischemic area in the left corona radiata (E,F). Therapy with acetylsalicylic acid was switched to clopidogrel as secondary prevention, while her symptoms improved in the next few weeks. The patient underwent a carotid doppler ultrasound and an echocardiogram, which did not reveal any pathological changes. The review of the blood pressure log, both in-hospital and the personal one the patient had kept, excluded uncontrolled hypertension."}
**A sample of qrels**
query-id corpus-id score
8647806-1 6437752-1 1
8647806-1 6946242-1 1
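A TREC-style qrels file such as the sample above (tab-separated, with a header line) can be parsed into a nested dictionary like this:

```python
# Parse a TREC-style qrels TSV (query-id, corpus-id, score) into
# {query_id: {corpus_id: score}}. The first line is a header.
qrels_tsv = (
    "query-id\tcorpus-id\tscore\n"
    "8647806-1\t6437752-1\t1\n"
    "8647806-1\t6946242-1\t1\n"
)
qrels = {}
for line in qrels_tsv.splitlines()[1:]:
    qid, cid, score = line.split("\t")
    qrels.setdefault(qid, {})[cid] = int(score)
print(qrels)
```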
### Data Splits
Refer to our paper.
## Dataset Creation
If you are interested in the collection of PMC-Patients and reproducing our baselines, please refer to [this repository](https://github.com/zhao-zy15/PMC-Patients).
### Citation Information
If you find PMC-Patients helpful in your research, please cite our work by:
```
@misc{zhao2023pmcpatients,
title={PMC-Patients: A Large-scale Dataset of Patient Summaries and Relations for Benchmarking Retrieval-based Clinical Decision Support Systems},
author={Zhengyun Zhao and Qiao Jin and Fangyuan Chen and Tuorui Peng and Sheng Yu},
year={2023},
eprint={2202.13876},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
CyberHarem/koharu_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of koharu/下江コハル/小春 (Blue Archive)
This is the dataset of koharu/下江コハル/小春 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `pink_hair, wings, head_wings, black_wings, twintails, feathered_wings, halo, long_hair, pink_eyes, low_wings, hat, black_headwear, beret, pink_halo`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 918.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koharu_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 752.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koharu_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1337 | 1.60 GiB | [Download](https://huggingface.co/datasets/CyberHarem/koharu_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/koharu_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, black_bikini, blush, floral_print, hair_bow, low_twintails, official_alternate_costume, print_bikini, small_breasts, black_bow, front-tie_bikini_top, solo, looking_at_viewer, closed_mouth, navel, collarbone, cowboy_shot, simple_background, hair_between_eyes, white_background |
| 1 | 10 |  |  |  |  |  | 1girl, black_shirt, blush, closed_mouth, collarbone, looking_at_viewer, pink_neckerchief, school_uniform, simple_background, solo, white_background, white_sailor_collar, off_shoulder, upper_body, long_sleeves, single_bare_shoulder |
| 2 | 8 |  |  |  |  |  | 1girl, black_shirt, blush, closed_mouth, long_sleeves, looking_at_viewer, pleated_skirt, solo, white_sailor_collar, cowboy_shot, off_shoulder, collarbone, pink_neckerchief, simple_background, sleeves_past_wrists, white_background, red_skirt, serafuku, bag, pink_skirt |
| 3 | 6 |  |  |  |  |  | 1girl, black_shirt, blush, closed_mouth, long_sleeves, loose_socks, pleated_skirt, rifle, solo, white_sailor_collar, white_socks, black_footwear, full_body, holding_gun, looking_at_viewer, shoes, simple_background, sleeves_past_wrists, white_background, collarbone, kneehighs, off_shoulder, bag, pink_neckerchief, serafuku, standing |
| 4 | 6 |  |  |  |  |  | 1girl, black_shirt, blush, collarbone, long_sleeves, looking_at_viewer, off_shoulder, pink_neckerchief, pleated_skirt, school_uniform, solo, white_sailor_collar, white_socks, kneehighs, loose_socks, red_skirt, shoes, wariza, black_footwear |
| 5 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, enmaided, maid_headdress, simple_background, black_dress, closed_mouth, maid_apron, white_apron, bowtie, frilled_apron, frilled_dress, hair_bow, pink_bow, puffy_short_sleeves, white_background, collarbone, full_body, open_mouth, pink_background, portrait, shoes, small_breasts, waist_apron, white_thighhighs |
| 6 | 5 |  |  |  |  |  | detached_collar, playboy_bunny, rabbit_ears, strapless_leotard, alternate_costume, bare_shoulders, fake_animal_ears, open_mouth, small_breasts, wrist_cuffs, 1girl, ass, black_leotard, black_pantyhose, blush, bowtie, fake_tail, looking_at_viewer, rabbit_tail, solo, bandaids_on_nipples, feet, hair_between_eyes, indoors, legs, loli, multiple_girls, no_shoes, on_side, red_leotard, soles, thighband_pantyhose, toes, white_pantyhose, window |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_bikini | blush | floral_print | hair_bow | low_twintails | official_alternate_costume | print_bikini | small_breasts | black_bow | front-tie_bikini_top | solo | looking_at_viewer | closed_mouth | navel | collarbone | cowboy_shot | simple_background | hair_between_eyes | white_background | black_shirt | pink_neckerchief | school_uniform | white_sailor_collar | off_shoulder | upper_body | long_sleeves | single_bare_shoulder | pleated_skirt | sleeves_past_wrists | red_skirt | serafuku | bag | pink_skirt | loose_socks | rifle | white_socks | black_footwear | full_body | holding_gun | shoes | kneehighs | standing | wariza | enmaided | maid_headdress | black_dress | maid_apron | white_apron | bowtie | frilled_apron | frilled_dress | pink_bow | puffy_short_sleeves | open_mouth | pink_background | portrait | waist_apron | white_thighhighs | detached_collar | playboy_bunny | rabbit_ears | strapless_leotard | alternate_costume | bare_shoulders | fake_animal_ears | wrist_cuffs | ass | black_leotard | black_pantyhose | fake_tail | rabbit_tail | bandaids_on_nipples | feet | indoors | legs | loli | multiple_girls | no_shoes | on_side | red_leotard | soles | thighband_pantyhose | toes | white_pantyhose | window |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:---------------|:-----------|:----------------|:-----------------------------|:---------------|:----------------|:------------|:-----------------------|:-------|:--------------------|:---------------|:--------|:-------------|:--------------|:--------------------|:--------------------|:-------------------|:--------------|:-------------------|:-----------------|:----------------------|:---------------|:-------------|:---------------|:-----------------------|:----------------|:----------------------|:------------|:-----------|:------|:-------------|:--------------|:--------|:--------------|:-----------------|:------------|:--------------|:--------|:------------|:-----------|:---------|:-----------|:-----------------|:--------------|:-------------|:--------------|:---------|:----------------|:----------------|:-----------|:----------------------|:-------------|:------------------|:-----------|:--------------|:-------------------|:------------------|:----------------|:--------------|:--------------------|:--------------------|:-----------------|:-------------------|:--------------|:------|:----------------|:------------------|:------------|:--------------|:----------------------|:-------|:----------|:-------|:-------|:-----------------|:-----------|:----------|:--------------|:--------|:----------------------|:-------|:------------------|:---------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | X | | | | | | | | | X | X | X | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | | | | | | | | | X | X | X | | X | X | X | | X | X | X | | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | | | | | | | | X | X | X | | X | | X | | X | X | X | | X | X | | X | | X | X | | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | | | | | | | | | X | X | | | X | | | | | X | X | X | X | X | | X | | X | | X | | | | X | | X | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | | X | | | | X | | | X | X | X | | X | | X | | X | | | | | | | | | | | | | | | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | | | | | | X | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ebrigham/DBRD | ---
license: mit
task_categories:
- text-classification
language:
- nl
pretty_name: DBRD
configs:
- config_name: default
  data_files:
  - split: train
    path:
    - train/neg/*
    - train/pos/*
  - split: test
    path:
    - test/neg/*
    - test/pos/*
dataset_info:
  features:
  - name: text
    dtype: string
  - name: label
    dtype: int64  # 1 for positive, -1 for negative
  splits:
  - name: train
    num_examples: 20027
  - name: test
    num_examples: 2223
download_size: 79.1MB
dataset_size: 773.4MB
---
# Dataset Card for "DBRD: Dutch Book Reviews Dataset"
Translation of the [Dutch Book Review Dataset (DBRD)](https://github.com/benjaminvdb/DBRD), an extensive collection of over 110k book reviews with associated binary sentiment polarity labels. The dataset is designed for sentiment classification in Dutch and is influenced by the [Large Movie Review Dataset](http://ai.stanford.edu/~amaas/data/sentiment/).
The dataset and the scripts used for scraping the reviews from [Hebban](https://www.hebban.nl), a Dutch platform for book enthusiasts, can be found in the [DBRD GitHub repository](https://github.com/benjaminvdb/DBRD).
# Labels
Distribution of labels positive/negative/neutral in rounded percentages.
```
training: 50/50/ 0
test: 50/50/ 0
```
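The 50/50 balance above can be checked directly from the label column. A minimal sketch, assuming the 1/-1 label convention described in the schema (the function and example data are illustrative, not part of the dataset's tooling):

```python
from collections import Counter

def label_distribution(labels):
    """Return rounded percentages per label value (1 = positive, -1 = negative)."""
    counts = Counter(labels)
    total = len(labels)
    return {label: round(100 * count / total) for label, count in counts.items()}

# Toy example mirroring the balanced split described above.
labels = [1] * 10 + [-1] * 10
print(label_distribution(labels))  # {1: 50, -1: 50}
```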
# Attribution
Please use the following citation when making use of this dataset in your work:
```citation
@article{DBLP:journals/corr/abs-1910-00896,
author = {Benjamin van der Burgh and
Suzan Verberne},
title = {The merits of Universal Language Model Fine-tuning for Small Datasets
- a case with Dutch book reviews},
journal = {CoRR},
volume = {abs/1910.00896},
year = {2019},
url = {http://arxiv.org/abs/1910.00896},
archivePrefix = {arXiv},
eprint = {1910.00896},
timestamp = {Fri, 04 Oct 2019 12:28:06 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1910-00896.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
# Acknowledgements (as per GIT repository)
This dataset was created for testing out the ULMFiT (by Jeremy Howard and Sebastian Ruder) deep learning algorithm for text classification. It is implemented in the FastAI Python library that has taught me a lot. I'd also like to thank Timo Block for making his 10kGNAD dataset publicly available and giving me a starting point for this dataset. The dataset structure based on the Large Movie Review Dataset by Andrew L. Maas et al. Thanks to Andreas van Cranenburg for pointing out a problem with the dataset.
And of course I'd like to thank all the reviewers on Hebban for having taken the time to write all these reviews. You've made both book enthusiasts and NLP researchers very happy :) |
gweltou/web-sentences-br | ---
license: apache-2.0
language:
- br
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
---
Breton sentences from the public web. Filtered and deduplicated.
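The filtering and deduplication step could be sketched as follows. This is a hypothetical, order-preserving exact dedup; the actual pipeline used for this dataset is not published here:

```python
def dedupe_sentences(sentences):
    """Drop empty lines and exact duplicates while preserving first-seen order."""
    seen = set()
    out = []
    for s in sentences:
        key = s.strip()
        if key and key not in seen:
            seen.add(key)
            out.append(key)
    return out

print(dedupe_sentences(["Demat dit.", "Demat dit.", "", "Mont a ra mat."]))
# ['Demat dit.', 'Mont a ra mat.']
```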
Mostly KLT orthography.
Around 1M words. |
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_D_PNP_GENERIC_C_Q_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__
num_bytes: 140602
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 140634
num_examples: 1000
download_size: 104588
dataset_size: 281236
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_D_PNP_GENERIC_C_Q_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yunyu/wiki40b_en_100_0_split | ---
dataset_info:
features:
- name: _id
dtype: string
- name: datasets_id
dtype: int32
- name: wiki_id
dtype: string
- name: start_paragraph
dtype: int32
- name: start_character
dtype: int32
- name: end_paragraph
dtype: int32
- name: end_character
dtype: int32
- name: article_title
dtype: string
- name: section_title
dtype: string
- name: passage_text
dtype: string
splits:
- name: train
num_bytes: 12927635491
num_examples: 17553713
download_size: 7022389836
dataset_size: 12927635491
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wiki40b_en_100_0_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-12000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1121798
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
monmamo/delphine-fairheart | ---
license: cc
language:
- en
pretty_name: Delphine Fairheart
size_categories:
- n<1K
---
Delphine is a monster trainer who was born and raised in the city of Acadie. She is a person of the Dracquin race, a race of dragon-form women. She is the daughter of a Dracquin mother and a Saurander father.
These are her main physical properties:
* At adulthood she is about 6 feet tall.
* Her color palette is shades of purple, particularly lavender.
* Her skin is rough, with a subtropical tone and lavender freckles.
* Her hair is a rich purple, with a slight and subtle curve. She usually lets it grow out only to her shoulders.
* Her pupils are a healthy purple with no blemishes.
* Her figure is full, round and thick, like that of a European dragon. Her hips are wide and her legs are thick. She has a large round belly, but she isn't fat.
* Her ears are large, rubbery and each supported by a horn. The horn and ear skin are a shade of purple consistent with her skin and hair. (Note: In some of these images she has an extra smaller ear sticking out of her hair. These ears are artifacts of AI. They are *not* part of her actual anatomy.)
* She has a tail. It is about 6 feet long, thick, and hairless. It is the same color as her skin.
* Her nose is also like that of a dragon: large and broad, with wide, round nostrils. Despite this feature, her face is cute and charming.
* Dracquins have wings, but Delphine's wings are small and undeveloped due to a lifetime of binding. (The society in which she lives has negative attitudes about wings and horns that have resulted in some unfortunate customs and assumptions.)
Delphine dresses very modestly but she has a sense of style. She prefers clothing that fits her figure well (she has a very hard time finding any that actually do). She's not afraid of being sexy but doesn't want people to gawk at her figure. She's actually very self-conscious about certain parts of her body.
Notes on the included images:
* Her horns and ears have been the hardest part to reproduce consistently with AI.
* AI doesn't produce tails very well, so many of my renders of Delphine simply don't have a tail.
* I will need several commissions of her to properly train an AI model to reproduce her accurately. |
medmac01/moroccan_history_qa | ---
license: cc0-1.0
task_categories:
- question-answering
language:
- en
tags:
- history
- Morocco
pretty_name: 🇲🇦 Moroccan History Dataset for Contextual Question Answering
size_categories:
- 1K<n<10K
--- |
BlackBeenie/ultrafeedback_prompt_scores | ---
license: mit
task_categories:
- text-retrieval
- sentence-similarity
language:
- en
--- |
diwank/hinglish-dump | ---
license: mit
---
# Hinglish Dump
Raw merged dump of Hinglish (hi-EN) datasets.
## Subsets and features
Subsets:
- crowd_transliteration
- hindi_romanized_dump
- hindi_xlit
- hinge
- hinglish_norm
- news2018
```
_FEATURE_NAMES = [
"target_hinglish",
"source_hindi",
"parallel_english",
"annotations",
"raw_input",
"alternates",
]
```
|
redwoodresearch/generated_stories_easy | ---
dataset_info:
features:
- name: text
dtype: string
- name: is_correct
dtype: bool
- name: is_clean
dtype: bool
- name: overall_tamper_evidence
dtype: bool
- name: measurements
sequence: bool
- name: individual_tamper_evidence
sequence: bool
splits:
- name: train
num_bytes: 12556260
num_examples: 2544
- name: validation
num_bytes: 5203051
num_examples: 1051
download_size: 7473594
dataset_size: 17759311
---
# Dataset Card for "generated_stories_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ozziey/poems_dataset | ---
license: afl-3.0
task_categories:
- tabular-classification
language:
- en
pretty_name: Detected emotions and information for poetry dataset
size_categories:
- n<1K
--- |
rookshanks/gsm100 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: alternate_answer
dtype: string
splits:
- name: train
num_bytes: 79902
num_examples: 100
download_size: 53368
dataset_size: 79902
---
# Dataset Card for "gsm100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Veerarajank/test | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mdance/pets | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 75141719.0
num_examples: 2
download_size: 15744879
dataset_size: 75141719.0
---
# Dataset Card for "pets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hellisotherpeople/one_syllable | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- expert-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: 'one_syllable from Most Language Models can be Poets too: An AI Writing Assistant and Constrained Text Generation Studio'
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- syllable
- one_syllable
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
---
# Dataset Card for one_syllable
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage**: https://github.com/Hellisotherpeople/Constrained-Text-Generation-Studio
- **Repository**: https://github.com/Hellisotherpeople/Constrained-Text-Generation-Studio
- **Paper:** Most Language Models can be Poets too: An AI Writing Assistant and Constrained Text Generation Studio
- **Leaderboard**: https://github.com/Hellisotherpeople/Constrained-Text-Generation-Studio
- **Point of Contact**: https://www.linkedin.com/in/allen-roush-27721011b/
### Dataset Summary

This is a dataset of English books written using only words of one syllable. At this time, the dataset only contains Robinson Crusoe — in Words of One Syllable by Lucy Aikin and Daniel Defoe.
This dataset is contributed as part of a paper titled "Most Language Models can be Poets too: An AI Writing Assistant and Constrained Text Generation Studio" to appear at COLING 2022. This dataset does not appear in the paper itself, but was gathered as a candidate constrained text generation dataset.
### Supported Tasks and Leaderboards
The main task for this dataset is Constrained Text Generation - but all types of language modeling are suitable.
### Languages
English
## Dataset Structure
### Data Instances
Each instance is extracted directly from the available PDF or EPUB documents, converted to txt using pandoc.
### Data Fields
Text. The name of each work appears before the work starts and again at the end, so the books can be trivially split again if necessary.
### Data Splits
None given. The way I do so in the paper is to extract the final 20% of each book and concatenate these together. This may not be the most ideal way to do a train/test split, but I couldn't think of a better one. I did not believe random sampling was appropriate, but I could be wrong.
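The tail-20% split described above can be sketched as follows. This is a minimal, hypothetical implementation; the paper's exact splitting code is not reproduced here:

```python
def train_test_split_tail(books, test_fraction=0.2):
    """Take the final `test_fraction` of each book's text for the test set,
    then concatenate the per-book pieces."""
    train_parts, test_parts = [], []
    for text in books:
        cut = int(len(text) * (1 - test_fraction))
        train_parts.append(text[:cut])
        test_parts.append(text[cut:])
    return "\n".join(train_parts), "\n".join(test_parts)

book = "a" * 80 + "b" * 20
train, test = train_test_split_tail([book])
print(len(train), len(test))  # 80 20
```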
## Dataset Creation
### Curation Rationale
There are several books which claim to only be written using one syllable words. A list of them can be found here: https://diyhomeschooler.com/2017/01/25/classics-in-words-of-one-syllable-free-ebooks/
Unfortunately, after careful human inspection, it appears that only one of these works actually does reliably maintain the one syllable constraint through the whole text. Outside of proper names, I cannot spot or computationally find a single example of a more-than-one-syllable word in this whole work.
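A computational check along these lines might look like the sketch below. The vowel-group heuristic is an assumption for illustration, not the curator's actual method, and it only approximates syllable counts:

```python
import re

def approx_syllables(word):
    """Rough syllable count: number of vowel groups (a common heuristic)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def multisyllable_words(text):
    """Flag words the heuristic considers longer than one syllable."""
    return [w for w in re.findall(r"[A-Za-z]+", text) if approx_syllables(w) > 1]

print(multisyllable_words("He ran to the sea"))      # []
print(multisyllable_words("The island was remote"))  # ['island', 'remote']
```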
### Source Data
Robinson Crusoe — in Words of One Syllable by Lucy Aikin and Daniel Defoe
#### Initial Data Collection and Normalization
Project Gutenberg
#### Who are the source language producers?
Lucy Aikin and Daniel Defoe
### Annotations
#### Annotation process
None
#### Who are the annotators?
n/a
### Personal and Sensitive Information
None
## Considerations for Using the Data
There may be OCR conversion artifacts.
### Social Impact of Dataset
These books have existed for a while now, so it's unlikely that this dataset will have a dramatic social impact.
### Discussion of Biases
The only biases possible are related to the contents of Robinson Crusoe or the possibility of the authors changing Robinson Crusoe in some problematic way by using one-syllable words. This is unlikely, as this work was aimed at children.
### Other Known Limitations
It's possible that more works exist but were not well known enough for the authors to find and include them. Finding such works would be grounds for iteration of this dataset (e.g. a version 1.1 would be released). The goal of this project is to eventually encompass all book-length English-language works that do not use more than one syllable in each of their words (except for names).
## Additional Information
n/a
### Dataset Curators
Allen Roush
### Licensing Information
MIT
### Citation Information
TBA
### Contributions
Thanks to [@Hellisotherpeople](https://github.com/Hellisotherpeople) for adding this dataset.
|
celloscopeai/celloscope_28000_bangla_ner_dataset | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 4406091
num_examples: 22052
- name: validation
num_bytes: 1118019
num_examples: 2756
- name: test
num_bytes: 1101591
num_examples: 2758
download_size: 975274
dataset_size: 6625701
---
# Dataset Card for "celloscope_28000_bangla_ner_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ConvLab/sgd3 | ---
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: SGD-X v3
size_categories:
- 10K<n<100K
task_categories:
- conversational
---
# Dataset Card for SGD-X v3
- **Repository:** https://github.com/google-research-datasets/dstc8-schema-guided-dialogue/tree/master/sgd_x
- **Paper:** https://arxiv.org/pdf/2110.06800.pdf
- **Leaderboard:** None
- **Who transforms the dataset:** Qi Zhu (zhuq96 at gmail dot com)
To use this dataset, you need to install [ConvLab-3](https://github.com/ConvLab/ConvLab-3) platform first. Then you can load the dataset via:
```
from convlab.util import load_dataset, load_ontology, load_database
dataset = load_dataset('sgd3')
ontology = load_ontology('sgd3')
database = load_database('sgd3')
```
For more usage please refer to [here](https://github.com/ConvLab/ConvLab-3/tree/master/data/unified_datasets).
### Dataset Summary
The **Schema-Guided Dialogue (SGD)** dataset consists of over 20k annotated multi-domain, task-oriented conversations between a human and a virtual assistant. These conversations involve interactions with services and APIs spanning 20 domains, such as banks, events, media, calendar, travel, and weather. For most of these domains, the dataset contains multiple different APIs, many of which have overlapping functionalities but different interfaces, which reflects common real-world scenarios. The wide range of available annotations can be used for intent prediction, slot filling, dialogue state tracking, policy imitation learning, language generation, and user simulation learning, among other tasks for developing large-scale virtual assistants. Additionally, the dataset contains unseen domains and services in the evaluation set to quantify the performance in zero-shot or few-shot settings.
The **SGD-X** dataset consists of 5 linguistic variants of every schema in the original SGD dataset. Linguistic variants were written by hundreds of paid crowd-workers. In the SGD-X directory, v1 represents the variant closest to the original schemas and v5 the farthest in terms of linguistic distance. To evaluate model performance on SGD-X schemas, dialogues must be converted using the script generate_sgdx_dialogues.py.
- **How to get the transformed data from original data:**
- Download [dstc8-schema-guided-dialogue-master.zip](https://github.com/google-research-datasets/dstc8-schema-guided-dialogue/archive/refs/heads/master.zip).
  - Modify `sgd_x/generate_sgdx_dialogues.py` as described in https://github.com/google-research-datasets/dstc8-schema-guided-dialogue/issues/57
  - Run `python -m sgd_x.generate_sgdx_dialogues` under the `dstc8-schema-guided-dialogue-master` dir, which requires tensorflow to be installed.
- Run `python preprocess.py` in the current directory.
- **Main changes of the transformation:**
- Lower case original `act` as `intent`.
- Add `count` slot for each domain, non-categorical, find span by text matching.
- Categorize `dialogue acts` according to the `intent`.
- Concatenate multiple values using `|`.
- Retain `active_intent`, `requested_slots`, `service_call`.
- **Annotations:**
- dialogue acts, state, db_results, service_call, active_intent, requested_slots.
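Two of the transformation steps listed above — lower-casing the original act into an intent and joining multiple slot values with `|` — can be sketched as follows. This is an illustrative fragment, not ConvLab-3's actual preprocessing code, and the function name and example values are hypothetical:

```python
def transform_turn(act, slot_values):
    """Lower-case the act name into an intent; join multiple values with '|'."""
    intent = act.lower()
    state = {slot: "|".join(values) for slot, values in slot_values.items()}
    return intent, state

intent, state = transform_turn("INFORM", {"city": ["San Jose", "SF"]})
print(intent)  # inform
print(state)   # {'city': 'San Jose|SF'}
```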
### Supported Tasks and Leaderboards
NLU, DST, Policy, NLG, E2E
### Languages
English
### Data Splits
| split | dialogues | utterances | avg_utt | avg_tokens | avg_domains | cat slot match(state) | cat slot match(goal) | cat slot match(dialogue act) | non-cat slot span(dialogue act) |
|------------|-------------|--------------|-----------|--------------|---------------|-------------------------|------------------------|--------------------------------|-----------------------------------|
| train | 16142 | 329964 | 20.44 | 9.75 | 1.84 | 100 | - | 100 | 100 |
| validation | 2482 | 48726 | 19.63 | 9.66 | 1.84 | 100 | - | 100 | 100 |
| test | 4201 | 84594 | 20.14 | 10.4 | 2.02 | 100 | - | 100 | 100 |
| all | 22825 | 463284 | 20.3 | 9.86 | 1.87 | 100 | - | 100 | 100 |
45 domains: ['Banks_13', 'Buses_13', 'Buses_23', 'Calendar_13', 'Events_13', 'Events_23', 'Flights_13', 'Flights_23', 'Homes_13', 'Hotels_13', 'Hotels_23', 'Hotels_33', 'Media_13', 'Movies_13', 'Music_13', 'Music_23', 'RentalCars_13', 'RentalCars_23', 'Restaurants_13', 'RideSharing_13', 'RideSharing_23', 'Services_13', 'Services_23', 'Services_33', 'Travel_13', 'Weather_13', 'Alarm_13', 'Banks_23', 'Flights_33', 'Hotels_43', 'Media_23', 'Movies_23', 'Restaurants_23', 'Services_43', 'Buses_33', 'Events_33', 'Flights_43', 'Homes_23', 'Media_33', 'Messaging_13', 'Movies_33', 'Music_33', 'Payment_13', 'RentalCars_33', 'Trains_13']
- **cat slot match**: how many values of categorical slots are in the possible values of ontology in percentage.
- **non-cat slot span**: how many values of non-categorical slots have span annotation in percentage.
### Citation
```
@inproceedings{lee2022sgd,
title={SGD-X: A Benchmark for Robust Generalization in Schema-Guided Dialogue Systems},
author={Lee, Harrison and Gupta, Raghav and Rastogi, Abhinav and Cao, Yuan and Zhang, Bin and Wu, Yonghui},
booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
volume={36},
number={10},
pages={10938--10946},
year={2022}
}
```
### Licensing Information
[**CC BY-SA 4.0**](https://creativecommons.org/licenses/by-sa/4.0/) |
autoevaluate/autoeval-eval-emotion-default-98e72c-1536755281 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: Jorgeutd/sagemaker-roberta-base-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: Jorgeutd/sagemaker-roberta-base-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@neehau](https://huggingface.co/neehau) for evaluating this model. |
heliosprime/twitter_dataset_1713039013 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13515
num_examples: 30
download_size: 9002
dataset_size: 13515
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713039013"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/violetevergarden | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Violet Evergarden
This is the image base of bangumi Violet Evergarden, we detected 67 characters, 4727 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 35 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 166 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 56 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 148 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 20 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 79 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 61 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 112 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 104 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 93 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 44 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 163 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 47 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 16 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 107 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 140 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 38 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 46 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 16 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 27 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 35 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 47 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 18 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 24 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 306 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 35 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 192 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 62 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 20 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 21 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 22 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 18 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 29 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 75 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 33 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 36 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 14 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 21 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 24 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 64 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 22 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 228 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 23 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 41 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 21 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 7 | [Download](45/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 46 | 14 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 13 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 22 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 17 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 9 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 18 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 14 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 21 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 1063 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 96 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 34 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 12 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 8 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 98 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 10 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 14 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 6 | [Download](62/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 63 | 6 | [Download](63/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 64 | 10 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 8 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 278 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
owanr/r1_iterater | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
- name: tag
dtype: string
splits:
- name: train
num_bytes: 5220094.0
num_examples: 13703
- name: val
num_bytes: 662984.0
num_examples: 1692
- name: test
num_bytes: 680361.0
num_examples: 1707
download_size: 0
dataset_size: 6563439.0
---
# Dataset Card for "r1_iterater"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VanessaSchenkel/handmade-dataset | ---
annotations_creators:
- found
language:
- en
- pt
language_creators:
- found
license:
- afl-3.0
multilinguality:
- translation
pretty_name: VanessaSchenkel/handmade-dataset
size_categories:
- n<1K
source_datasets:
- original
tags: []
task_categories:
- translation
task_ids: []
---
A dataset of sentences about professions; half of the translations use feminine forms and half use masculine forms.
How to use it:
```python
from datasets import load_dataset
remote_dataset = load_dataset("VanessaSchenkel/handmade-dataset", field="data")
remote_dataset
```
Output:
```
DatasetDict({
train: Dataset({
features: ['id', 'translation'],
num_rows: 388
})
})
```
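Each row pairs an `id` with a `translation` dict keyed by language name (see the example record further down this card). Extracting the sentence pair from a row is plain dict access; a minimal, network-free sketch over a hard-coded row of the same shape:

```python
# Network-free sketch: a hard-coded row with the same shape as the
# dataset's records ('id' plus a 'translation' dict per language).
row = {
    "id": "5",
    "translation": {
        "english": "the postman finished her work .",
        "portuguese": "A carteira terminou seu trabalho .",
    },
}

def sentence_pair(record):
    """Return the (english, portuguese) sentence pair of one record."""
    t = record["translation"]
    return t["english"], t["portuguese"]

en, pt = sentence_pair(row)
print(en)  # the postman finished her work .
print(pt)  # A carteira terminou seu trabalho .
```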
Example:
```python
remote_dataset["train"][5]
```
Output:
```
{'id': '5',
'translation': {'english': 'the postman finished her work .',
'portuguese': 'A carteira terminou seu trabalho .'}}
``` |
Devdeshitha/clomistral7b | ---
license: mit
--- |
Renatanimareli/renatasantosss | ---
license: openrail
---
|
CyberHarem/scamp_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of scamp (Kantai Collection)
This is the dataset of scamp (Kantai Collection), containing 166 images and their tags.
The core tags of this character are `long_hair, side_ponytail, hair_ornament, star_hair_ornament, hat, grey_hair, garrison_cap, aqua_headwear, hair_ribbon, ribbon, black_ribbon, grey_eyes, breasts, small_breasts, brown_eyes, hair_between_eyes, headgear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 166 | 222.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scamp_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 166 | 118.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scamp_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 433 | 279.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scamp_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 166 | 194.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scamp_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 433 | 396.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scamp_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/scamp_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
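Once loaded, the per-item tags in `item.meta['tags']` can be filtered with ordinary Python operations. A minimal sketch over stand-in records (the list below is hypothetical illustration data, not part of the dataset; real items carry the same kind of tag metadata):

```python
# Hedged sketch: select images carrying a given tag. The records below
# are stand-ins for the metadata that LocalSource yields -- each has a
# filename and a collection of tag names (as in the cluster tables below).
items = [
    {"filename": "a.png", "tags": ["1girl", "solo", "white_gloves"]},
    {"filename": "b.png", "tags": ["1girl", "sitting"]},
]

def filenames_with_tag(records, tag):
    """Return filenames of all records tagged with `tag`."""
    return [r["filename"] for r in records if tag in r["tags"]]

print(filenames_with_tag(items, "white_gloves"))  # ['a.png']
```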
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, competition_swimsuit, highleg_swimsuit, solo, star_(symbol), white_gloves, looking_at_viewer, short_shorts, tongue_out, white_shorts, cowboy_shot, sitting |
| 1 | 15 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, competition_swimsuit, highleg_swimsuit, solo, star_(symbol), white_gloves, white_shorts, short_shorts, cowboy_shot |
| 2 | 8 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, competition_swimsuit, highleg_swimsuit, short_shorts, simple_background, solo, star_(symbol), white_background, white_gloves, white_shorts, cowboy_shot, twitter_username, collarbone, dated, blush, covered_navel, one-hour_drawing_challenge |
| 3 | 6 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, competition_swimsuit, highleg_swimsuit, short_shorts, simple_background, solo, star_(symbol), white_background, white_gloves, white_shorts, holding_candy, blush, cowboy_shot, collarbone, smile |
| 4 | 7 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, competition_swimsuit, cowboy_shot, highleg_swimsuit, holding_candy, short_shorts, solo, star_(symbol), white_gloves, white_shorts, lollipop, tongue_out, character_name |
| 5 | 8 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, competition_swimsuit, highleg_swimsuit, solo, star_(symbol), white_gloves, white_background, simple_background, collarbone, open_mouth, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_one-piece_swimsuit | competition_swimsuit | highleg_swimsuit | solo | star_(symbol) | white_gloves | looking_at_viewer | short_shorts | tongue_out | white_shorts | cowboy_shot | sitting | simple_background | white_background | twitter_username | collarbone | dated | blush | covered_navel | one-hour_drawing_challenge | holding_candy | smile | lollipop | character_name | open_mouth | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------------|:-----------------------|:-------------------|:-------|:----------------|:---------------|:--------------------|:---------------|:-------------|:---------------|:--------------|:----------|:--------------------|:-------------------|:-------------------|:-------------|:--------|:--------|:----------------|:-----------------------------|:----------------|:--------|:-----------|:-----------------|:-------------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | X | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | X | | X | X | X | X | X | X | X | X | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | X | | X | X | | X | | X | | | X | X | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | X | | | | | | | | | | X | | X | X | | |
| 5 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | X | X | | X | | | | | | | | | X | X |
|
mboth/kaelteVersorgen-50-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': KaelteErzeugen
'1': KaelteSpeichern
'2': KaelteVerteilen
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: Komponente
dtype: string
- name: ScoreKomponente
dtype: float64
- name: Datenpunkt
dtype: string
- name: ScoreDatenpunkt
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 27642.555450236967
num_examples: 112
- name: test
num_bytes: 32271
num_examples: 132
- name: valid
num_bytes: 32271
num_examples: 132
download_size: 51628
dataset_size: 92184.55545023696
---
# Dataset Card for "kaelteVersorgen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-70b-gpt4-2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-70b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T03:12:02.680525](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0/blob/main/results_2023-10-23T03-12-02.680525.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.34312080536912754,\n\
\ \"em_stderr\": 0.004861898980661869,\n \"f1\": 0.406266778523491,\n\
\ \"f1_stderr\": 0.004698880247232182,\n \"acc\": 0.5411001733512928,\n\
\ \"acc_stderr\": 0.011156340755977264\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.34312080536912754,\n \"em_stderr\": 0.004861898980661869,\n\
\ \"f1\": 0.406266778523491,\n \"f1_stderr\": 0.004698880247232182\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24715693707354056,\n \
\ \"acc_stderr\": 0.011881764043717088\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237441\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|arc:challenge|25_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|arc:challenge|25_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T09_58_31.478487
path:
- '**/details_harness|drop|3_2023-10-19T09-58-31.478487.parquet'
- split: 2023_10_23T03_12_02.680525
path:
- '**/details_harness|drop|3_2023-10-23T03-12-02.680525.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T03-12-02.680525.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T09_58_31.478487
path:
- '**/details_harness|gsm8k|5_2023-10-19T09-58-31.478487.parquet'
- split: 2023_10_23T03_12_02.680525
path:
- '**/details_harness|gsm8k|5_2023-10-23T03-12-02.680525.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T03-12-02.680525.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hellaswag|10_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hellaswag|10_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:04:11.236941.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T00:48:59.636533.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T00:04:11.236941.parquet'
- split: 2023_08_19T00_48_59.636533
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T00:48:59.636533.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T00:48:59.636533.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T09_58_31.478487
path:
- '**/details_harness|winogrande|5_2023-10-19T09-58-31.478487.parquet'
- split: 2023_10_23T03_12_02.680525
path:
- '**/details_harness|winogrande|5_2023-10-23T03-12-02.680525.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T03-12-02.680525.parquet'
- config_name: results
data_files:
- split: 2023_08_10T00_04_11.236941
path:
- results_2023-08-10T00:04:11.236941.parquet
- split: 2023_10_19T09_58_31.478487
path:
- results_2023-10-19T09-58-31.478487.parquet
- split: 2023_10_23T03_12_02.680525
path:
- results_2023-10-23T03-12-02.680525.parquet
- split: latest
path:
- results_2023-10-23T03-12-02.680525.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-70b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T03:12:02.680525](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0/blob/main/results_2023-10-23T03-12-02.680525.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.34312080536912754,
"em_stderr": 0.004861898980661869,
"f1": 0.406266778523491,
"f1_stderr": 0.004698880247232182,
"acc": 0.5411001733512928,
"acc_stderr": 0.011156340755977264
},
"harness|drop|3": {
"em": 0.34312080536912754,
"em_stderr": 0.004861898980661869,
"f1": 0.406266778523491,
"f1_stderr": 0.004698880247232182
},
"harness|gsm8k|5": {
"acc": 0.24715693707354056,
"acc_stderr": 0.011881764043717088
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237441
}
}
```
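As a sketch (using only the field names visible in the payload above), the per-task accuracy entries in such a results dictionary can be pulled out programmatically:

```python
# Minimal sketch: extract per-task accuracy from a results payload like the
# one shown above. The dictionary literal below just mirrors that example.
results = {
    "all": {"acc": 0.5411001733512928, "acc_stderr": 0.011156340755977264},
    "harness|gsm8k|5": {"acc": 0.24715693707354056, "acc_stderr": 0.011881764043717088},
    "harness|winogrande|5": {"acc": 0.835043409629045, "acc_stderr": 0.010430917468237441},
}

# Keep only entries that report an accuracy, skipping the "all" aggregate.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

best_task = max(per_task_acc, key=per_task_acc.get)
print(best_task)  # harness|winogrande|5
```

The same pattern works for the `em`/`f1` metrics reported by tasks such as `harness|drop|3`.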
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_63 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1110524464.0
num_examples: 218092
download_size: 1124852018
dataset_size: 1110524464.0
---
# Dataset Card for "chunk_63"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/partitioned_v2_standardized_15 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 102473870.31022772
num_examples: 213571
download_size: 54406448
dataset_size: 102473870.31022772
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_15"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_serial_verb_go | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 19248
num_examples: 68
- name: train
num_bytes: 40541
num_examples: 142
- name: validation
num_bytes: 3535
num_examples: 13
download_size: 54763
dataset_size: 63324
---
# Dataset Card for "MULTI_VALUE_mrpc_serial_verb_go"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xiyuez/im-feeling-curious | ---
license: odc-by
task_categories:
- question-answering
- text-generation
language:
- en
pretty_name: i'm feeling curious dataset
size_categories:
- 1K<n<10K
---
This public dataset is an extract from Google's "i'm feeling curious" feature. To learn more about this feature, search for "i'm feeling curious" on Google.
Tasks: Answering open-domain questions, generating random facts.
Limitations: May contain commercial content, false information, bias, or outdated information.
Language: English only.
This public extract is licensed under the Open Data Commons Attribution License: http://opendatacommons.org/licenses/by/1.0/.
There is no canonical train/test split.
This extract contains 2761 unique rows, which may increase as more data is crawled. Near-duplicates have been removed.
While we aimed to filter non-natural language content and duplicates, some may remain. The data may also contain toxic, biased, copyrighted or erroneous content. Google has done initial filtering, but we have not verified the data.
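As an illustration of the kind of near-duplicate filtering described above (a hypothetical sketch, not the exact procedure used to build this dataset), a minimal normalization-based pass might look like:

```python
# Hypothetical sketch of a near-duplicate filter: normalize case, whitespace,
# and trailing punctuation, then keep the first row per normalized key.
# This is NOT the exact procedure used to build this dataset.
import re

def normalize(text: str) -> str:
    text = text.lower().strip()
    text = re.sub(r"\s+", " ", text)  # collapse runs of whitespace
    return text.rstrip(".?! ")        # ignore trailing punctuation

def drop_near_duplicates(rows):
    seen, kept = set(), []
    for row in rows:
        key = normalize(row)
        if key not in seen:
            seen.add(key)
            kept.append(row)
    return kept

rows = [
    "Why is the sky blue?",
    "why is the sky blue",
    "How many bones are in the human body?",
]
print(drop_near_duplicates(rows))  # first and third rows survive
```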
Use this dataset at your own risk. We provide no warranty and accept no liability.
Google is a registered trademark of Google LLC. This project is not affiliated with, endorsed or sponsored by Google.
|
ttxy/emotion | ---
language:
- code
pretty_name: "English Emotion classification"
tags:
- classification
license: "bsd"
task_categories:
- text-classification
---
A dataset of English Twitter messages covering six basic emotions: anger, fear, joy, love, sadness, and surprise.
GitHub link: https://github.com/dair-ai/emotion_dataset
|
Fer2207/pedrinho | ---
license: openrail
---
|
anan-2024/twitter_dataset_1713065146 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21030
num_examples: 46
download_size: 11320
dataset_size: 21030
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
erbacher/AmbigNQ-clarifying-question | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: clar
dtype: string
- name: question
dtype: string
- name: ambig
dtype: bool
- name: input_passage
dtype: string
- name: intent
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 62693997.0
num_examples: 10000
- name: dev
num_bytes: 6291036.0
num_examples: 1001
- name: test
num_bytes: 64783344.0
num_examples: 1000
download_size: 75095693
dataset_size: 133768377.0
---
# Dataset Card for "AmbigNQ-clarifying-question"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sandersaarond/Grafana-Community-Dashboards | ---
license: cc0-1.0
---
This is a raw dump of the dashboard JSON hosted at https://grafana.com/grafana/dashboards/, taken on 6 June 2023.
Dashboards themselves are json; related metadata is retained for filtering purposes (e.g., by number of downloads) to help in identifying useful data.
Dashboards may contain many different query languages, may range across many versions of Grafana, and may be completely broken (since anyone can upload one).
JSON structure varies considerably between different dashboards, and finding any specific thing you are interested in can, in and of itself, be difficult.
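As an illustration, the retained metadata can be used to narrow the dump to widely-used dashboards before parsing their JSON; a rough sketch (the `name`, `downloads`, and `json` field names are assumptions about the dump's schema, not guarantees):

```python
import json

def popular_dashboards(records, min_downloads=1000):
    """Keep dashboards above a download threshold and parse their dashboard JSON."""
    kept = []
    for rec in records:
        if rec.get("downloads", 0) >= min_downloads:
            raw = rec["json"]
            dashboard = json.loads(raw) if isinstance(raw, str) else raw
            kept.append((rec.get("name", "unnamed"), dashboard))
    return kept

# toy records mimicking the assumed schema
records = [
    {"name": "node-exporter", "downloads": 50000, "json": '{"panels": [1, 2]}'},
    {"name": "obscure", "downloads": 3, "json": '{"panels": []}'},
]
for name, dash in popular_dashboards(records):
    print(name, len(dash["panels"]))  # node-exporter 2
```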
No warranty of any kind is attached; if anyone wants their specific dashboard removed, they should contact me. In general, this dataset is intended to inform tooling for viewing, creating, and generating dashboards; each individual dashboard was previously offered publicly for general use. |
Nxrd/daenilset | ---
license: openrail
---
|
tfnn/Objaverse-PLY-125k | ---
license: mit
---
This is 125,160 _(out of 800,000)_ 3D models from [AllenAI Objaverse 1.0](https://huggingface.co/datasets/allenai/objaverse).
These models have had their materials and normals removed, have been normalised in scale with origins recentered, and are all triangulated.
As a result, each model is much smaller in size, and every model is uniform and faster to process.
This dataset is great for training networks on shapes of 3D objects.
My suggested use of this dataset is to ray-trace a point cloud of 𝑥 density from the triangle-faced meshes; this can then be used to train a neural network to produce point-cloud shapes, which can then be re-meshed.
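The suggested point-cloud step can also be done without ray tracing: uniform surface sampling with barycentric coordinates yields equivalent training data. A minimal pure-Python sketch for one triangle (a full pipeline would additionally weight triangles by area across the mesh):

```python
import math
import random

def sample_triangle(a, b, c, n, seed=0):
    """Uniformly sample n points on triangle (a, b, c) via barycentric coordinates."""
    rng = random.Random(seed)
    points = []
    for _ in range(n):
        r1, r2 = rng.random(), rng.random()
        u = 1.0 - math.sqrt(r1)          # barycentric weights: u + v + w == 1
        v = math.sqrt(r1) * (1.0 - r2)
        w = math.sqrt(r1) * r2
        points.append(tuple(u * a[i] + v * b[i] + w * c[i] for i in range(3)))
    return points

pts = sample_triangle((0, 0, 0), (1, 0, 0), (0, 1, 0), n=100)
assert all(abs(p[2]) < 1e-12 for p in pts)  # samples stay on the triangle's plane
```

The square-root mapping avoids the clustering toward one vertex that naive uniform barycentric weights would produce.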
The filenames will match up with the [Allen-AI Objaverse Metadata](https://huggingface.co/datasets/allenai/objaverse/tree/main/metadata).
If you want an even smaller dataset then [Plyverse-1.0](https://huggingface.co/datasets/tfnn/Plyverse-1.0) is the same pre-processing on 10,000 3D models and includes the original vertex normals.
Possible licenses of these models, as quoted from [AllenAI](https://allenai.org/):
- [CC-BY 4.0](https://creativecommons.org/licenses/by/4.0/)
- [CC-BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/)
- [CC-BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)
- [CC-BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)
- [CC0 1.0](https://creativecommons.org/publicdomain/zero/1.0/) |
lizhuang144/FACTUAL_Scene_Graph | ---
license: openrail
language:
- en
pretty_name: FACTUAL
size_categories:
- 10K<n<100K
---
The scene graph parsing dataset described in `FACTUAL: A Benchmark for Faithful and Consistent Textual Scene Graph Parsing`
Please see https://github.com/zhuang-li/FACTUAL for details. |
AlekseyKorshuk/cup-it-ds-pairwise-small | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 16773882
num_examples: 15859
- name: validation
num_bytes: 1849922
num_examples: 1762
download_size: 11756382
dataset_size: 18623804
---
# Dataset Card for "cup-it-ds-pairwise-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
massimilianowosz/website_categories | ---
dataset_info:
features:
- name: label
dtype: int64
- name: url
dtype: string
- name: text
dtype: string
- name: main_category
dtype: string
- name: main_category_id
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2020153829
num_examples: 15186
- name: test
num_bytes: 503500228
num_examples: 3797
download_size: 752658626
dataset_size: 2523654057
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-staging-eval-project-25118781-8365117 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- scientific_papers
eval_info:
task: summarization
model: google/bigbird-pegasus-large-arxiv
metrics: ['bertscore', 'meteor']
dataset_name: scientific_papers
dataset_config: pubmed
dataset_split: test
col_mapping:
text: article
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-arxiv
* Dataset: scientific_papers
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise_g](https://huggingface.co/Blaise_g) for evaluating this model. |
joelniklaus/mapa | ---
annotations_creators:
- other
language_creators:
- found
language:
- multilingual
- bg
- cs
- da
- de
- el
- en
- es
- et
- fi
- fr
- ga
- hu
- it
- lt
- lv
- mt
- nl
- pt
- ro
- sk
- sv
license:
- cc-by-4.0
multilinguality:
- multilingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: Spanish Datasets for Sensitive Entity Detection in the Legal Domain
tags:
- named-entity-recognition-and-classification
---
# Dataset Card for Multilingual European Datasets for Sensitive Entity Detection in the Legal Domain
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** [Spanish](https://elrc-share.eu/repository/browse/mapa-anonymization-package-spanish/b550e1a88a8311ec9c1a00155d026706687917f92f64482587c6382175dffd76/), [Most](https://elrc-share.eu/repository/search/?q=mfsp:3222a6048a8811ec9c1a00155d0267067eb521077db54d6684fb14ce8491a391), [German, Portuguese, Slovak, Slovenian, Swedish](https://elrc-share.eu/repository/search/?q=mfsp:833df1248a8811ec9c1a00155d0267067685dcdb77064822b51cc16ab7b81a36)
- **Paper:** de Gibert Bonet, O., García Pablos, A., Cuadros, M., & Melero, M. (2022). Spanish Datasets for Sensitive
Entity Detection in the Legal Domain. Proceedings of the Language Resources and Evaluation Conference, June,
3751–3760. http://www.lrec-conf.org/proceedings/lrec2022/pdf/2022.lrec-1.400.pdf
- **Leaderboard:**
- **Point of Contact:** [Joel Niklaus](mailto:joel.niklaus.2@bfh.ch)
### Dataset Summary
The dataset consists of 12 documents (9 for Spanish due to parsing errors) taken from EUR-Lex, a multilingual corpus of court
decisions and legal dispositions in the 24 official languages of the European Union. The documents have been annotated
for named entities following the guidelines of the [MAPA project](https://mapa-project.eu/), which foresees two
annotation levels, a general and a more fine-grained one. The annotated corpus can be used for named entity recognition/classification.
### Supported Tasks and Leaderboards
The dataset supports the task of Named Entity Recognition and Classification (NERC).
### Languages
The following languages are supported: bg, cs, da, de, el, en, es, et, fi, fr, ga, hu, it, lt, lv, mt, nl, pt, ro, sk, sv
## Dataset Structure
### Data Instances
The file format is jsonl and three data splits are present (train, validation and test). Named Entity annotations are
non-overlapping.
### Data Fields
For the annotation, the documents have been split into sentences. The annotation has been done at the token level.
The files contain the following data fields:
- `language`: language of the sentence
- `type`: The document type of the sentence. Currently, only EUR-LEX is supported.
- `file_name`: The document file name the sentence belongs to.
- `sentence_number`: The number of the sentence inside its document.
- `tokens`: The list of tokens in the sentence.
- `coarse_grained`: The coarse-grained annotations for each token
- `fine_grained`: The fine-grained annotations for each token
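Downstream, the per-token IOB tags in `coarse_grained`/`fine_grained` can be collapsed into labelled spans; a small illustrative sketch (the example sentence is invented, not drawn from the corpus):

```python
def iob_to_spans(tokens, tags):
    """Collapse per-token IOB tags into (label, start, end_exclusive, text) spans."""
    spans = []
    start = label = None
    for i, tag in enumerate(tags + ["O"]):          # sentinel flushes the last span
        inside = tag.startswith("I-") and label == tag[2:]
        if start is not None and not inside:        # current span ends here
            spans.append((label, start, i, " ".join(tokens[start:i])))
            start = label = None
        if tag.startswith("B-"):                    # a new span begins
            start, label = i, tag[2:]
    return spans

tokens = ["On", "3", "March", "2021", "the", "Commission", "decided"]
tags = ["O", "B-DATE", "I-DATE", "I-DATE", "O", "B-ORGANISATION", "O"]
print(iob_to_spans(tokens, tags))
# [('DATE', 1, 4, '3 March 2021'), ('ORGANISATION', 5, 6, 'Commission')]
```

Since the named entity annotations are non-overlapping, this single-pass conversion is sufficient for either annotation level.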
As previously stated, the annotation has been conducted on a global and a more fine-grained level.
The tagset used for the global and the fine-grained named entities is the following:
- Address
- Building
- City
- Country
- Place
- Postcode
- Street
- Territory
- Amount
- Unit
- Value
- Date
- Year
- Standard Abbreviation
- Month
- Day of the Week
- Day
- Calender Event
- Person
- Age
- Email
- Ethnic Category
- Family Name
- Financial
- Given Name – Female
- Given Name – Male
- Health Insurance Number
- ID Document Number
- Initial Name
- Marital Status
- Medical Record Number
- Nationality
- Profession
- Role
- Social Security Number
- Title
- Url
- Organisation
- Time
- Vehicle
- Build Year
- Colour
- License Plate Number
- Model
- Type
The final coarse-grained tagset (in IOB notation) is the following:
`['O', 'B-ORGANISATION', 'I-ORGANISATION', 'B-ADDRESS', 'I-ADDRESS', 'B-DATE', 'I-DATE', 'B-PERSON', 'I-PERSON', 'B-AMOUNT', 'I-AMOUNT', 'B-TIME', 'I-TIME']`
The final fine-grained tagset (in IOB notation) is the following:
`[
'O',
'B-BUILDING',
'I-BUILDING',
'B-CITY',
'I-CITY',
'B-COUNTRY',
'I-COUNTRY',
'B-PLACE',
'I-PLACE',
'B-TERRITORY',
'I-TERRITORY',
'I-UNIT',
'B-UNIT',
'B-VALUE',
'I-VALUE',
'B-YEAR',
'I-YEAR',
'B-STANDARD ABBREVIATION',
'I-STANDARD ABBREVIATION',
'B-MONTH',
'I-MONTH',
'B-DAY',
'I-DAY',
'B-AGE',
'I-AGE',
'B-ETHNIC CATEGORY',
'I-ETHNIC CATEGORY',
'B-FAMILY NAME',
'I-FAMILY NAME',
'B-INITIAL NAME',
'I-INITIAL NAME',
'B-MARITAL STATUS',
'I-MARITAL STATUS',
'B-PROFESSION',
'I-PROFESSION',
'B-ROLE',
'I-ROLE',
'B-NATIONALITY',
'I-NATIONALITY',
'B-TITLE',
'I-TITLE',
'B-URL',
'I-URL',
'B-TYPE',
'I-TYPE',
]`
### Data Splits
Splits created by Joel Niklaus.
| language | # train files | # validation files | # test files | # train sentences | # validation sentences | # test sentences |
|:-----------|----------------:|---------------------:|---------------:|--------------------:|-------------------------:|-------------------:|
| bg | 9 | 1 | 2 | 1411 | 166 | 560 |
| cs | 9 | 1 | 2 | 1464 | 176 | 563 |
| da | 9 | 1 | 2 | 1455 | 164 | 550 |
| de | 9 | 1 | 2 | 1457 | 166 | 558 |
| el | 9 | 1 | 2 | 1529 | 174 | 584 |
| en | 9 | 1 | 2 | 893 | 98 | 408 |
| es | 7 | 1 | 1 | 806 | 248 | 155 |
| et | 9 | 1 | 2 | 1391 | 163 | 516 |
| fi | 9 | 1 | 2 | 1398 | 187 | 531 |
| fr | 9 | 1 | 2 | 1297 | 97 | 490 |
| ga | 9 | 1 | 2 | 1383 | 165 | 515 |
| hu | 9 | 1 | 2 | 1390 | 171 | 525 |
| it | 9 | 1 | 2 | 1411 | 162 | 550 |
| lt | 9 | 1 | 2 | 1413 | 173 | 548 |
| lv | 9 | 1 | 2 | 1383 | 167 | 553 |
| mt | 9 | 1 | 2 | 937 | 93 | 442 |
| nl | 9 | 1 | 2 | 1391 | 164 | 530 |
| pt | 9 | 1 | 2 | 1086 | 105 | 390 |
| ro | 9 | 1 | 2 | 1480 | 175 | 557 |
| sk | 9 | 1 | 2 | 1395 | 165 | 526 |
| sv | 9 | 1 | 2 | 1453 | 175 | 539 |
## Dataset Creation
### Curation Rationale
*„[…] to our knowledge, there exist no open resources annotated for NERC [Named Entity Recognition and Classification] in Spanish in the legal domain. With the
present contribution, we intend to fill this gap. With the release of the created resources for fine-tuning and
evaluation of sensitive entities detection in the legal domain, we expect to encourage the development of domain-adapted
anonymisation tools for Spanish in this field“* (de Gibert Bonet et al., 2022)
### Source Data
#### Initial Data Collection and Normalization
The dataset consists of documents taken from the EUR-Lex corpus, which is publicly available. No further
information on the data collection process is given in de Gibert Bonet et al. (2022).
#### Who are the source language producers?
The source language producers are presumably lawyers.
### Annotations
#### Annotation process
*"The annotation scheme consists of a complex two level hierarchy adapted to the legal domain, it follows the scheme
described in (Gianola et al., 2020) […] Level 1 entities refer to general categories (PERSON, DATE, TIME, ADDRESS...)
and level 2 entities refer to more fine-grained subcategories (given name, personal name, day, year, month...). Eur-Lex,
CPP and DE have been annotated following this annotation scheme […] The manual annotation was performed using
INCePTION (Klie et al., 2018) by a sole annotator following the guidelines provided by the MAPA consortium."* (de Gibert
Bonet et al., 2022)
#### Who are the annotators?
Only one annotator conducted the annotation. No further information is provided in de Gibert Bonet et al. (2022).
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
Note that the dataset at hand presents only a small portion of a bigger corpus as described in de Gibert Bonet et al.
(2022). At the time of writing only the annotated documents from the EUR-Lex corpus were available.
Note that the information given in this dataset card refers to the dataset version as provided by Joel Niklaus and Veton
Matoshi. The dataset at hand is intended to be part of a bigger benchmark dataset. Creating a benchmark dataset
consisting of several other datasets from different sources requires postprocessing. Therefore, the structure of the
dataset at hand, including the folder structure, may differ considerably from the original dataset. In addition to that,
differences with regard to dataset statistics as given in the respective papers can be expected. The reader is advised to
have a look at the conversion script ```convert_to_hf_dataset.py``` in order to retrace the steps for converting the
original dataset into the present jsonl-format. For further information on the original dataset structure, we refer to
the bibliographical references and the original Github repositories and/or web pages provided in this dataset card.
## Additional Information
### Dataset Curators
The names of the original dataset curators and creators can be found in references given below, in the section *Citation
Information*. Additional changes were made by Joel Niklaus ([Email](mailto:joel.niklaus.2@bfh.ch)
; [Github](https://github.com/joelniklaus)) and Veton Matoshi ([Email](mailto:veton.matoshi@bfh.ch)
; [Github](https://github.com/kapllan)).
### Licensing Information
[Attribution 4.0 International (CC BY 4.0) ](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@article{DeGibertBonet2022,
author = {{de Gibert Bonet}, Ona and {Garc{\'{i}}a Pablos}, Aitor and Cuadros, Montse and Melero, Maite},
journal = {Proceedings of the Language Resources and Evaluation Conference},
number = {June},
pages = {3751--3760},
title = {{Spanish Datasets for Sensitive Entity Detection in the Legal Domain}},
url = {https://aclanthology.org/2022.lrec-1.400},
year = {2022}
}
```
### Contributions
Thanks to [@JoelNiklaus](https://github.com/joelniklaus) and [@kapllan](https://github.com/kapllan) for adding this
dataset.
|
bigscience-data/roots_zh_wikivoyage | ---
language: zh
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_zh_wikivoyage
# wikivoyage_filtered
- Dataset uid: `wikivoyage_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0334 % of total
- 0.1097 % of en
- 0.0432 % of fr
- 0.0863 % of es
- 0.0084 % of zh
- 0.0892 % of vi
- 0.0464 % of indic-bn
- 0.0443 % of pt
- 0.0130 % of indic-hi
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-bn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
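The per-language filter lists above describe a sequential document pipeline; a generic sketch of how such steps compose (the two stand-in steps are simplified assumptions, not the actual BigScience implementations):

```python
def run_pipeline(docs, steps):
    """Apply each processing step in order; a step maps a doc list to a doc list."""
    for step in steps:
        docs = step(docs)
    return docs

# simplified stand-ins for steps such as dedup_document / filter_remove_empty_docs
dedup_document = lambda docs: list(dict.fromkeys(docs))
filter_remove_empty_docs = lambda docs: [d for d in docs if d.strip()]

print(run_pipeline(["a", "a", "", "b"], [dedup_document, filter_remove_empty_docs]))  # ['a', 'b']
```

Note that step order matters: deduplication before sentence splitting, for example, operates on whole documents rather than individual sentences.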
|
lionelchg/dolly_open_qa | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2708807.1330839125
num_examples: 3554
- name: test
num_bytes: 143290.86691608766
num_examples: 188
download_size: 1724377
dataset_size: 2852098.0
---
# Dataset Card for "dolly_open_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/xinyan_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of xinyan/辛炎/辛焱 (Genshin Impact)
This is the dataset of xinyan/辛炎/辛焱 (Genshin Impact), containing 490 images and their tags.
The core tags of this character are `dark_skin, multicolored_hair, streaked_hair, dark-skinned_female, black_hair, red_hair, breasts, yellow_eyes, twintails, medium_breasts, spiked_hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 490 | 808.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xinyan_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 490 | 685.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xinyan_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1202 | 1.28 GiB | [Download](https://huggingface.co/datasets/CyberHarem/xinyan_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/xinyan_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_nails, completely_nude, looking_at_viewer, nipples, pussy, blush, navel, solo, uncensored, barefoot, feet, smile, stomach, toenail_polish, toes, collarbone, spread_legs, female_pubic_hair, spikes, hair_between_eyes, jewelry, sitting, sweat |
| 1 | 5 |  |  |  |  |  | 1girl, black_nails, cleavage, guitar, holding_instrument, looking_at_viewer, smile, solo, detached_sleeves, playing_instrument, shoulder_spikes, braid, clothing_cutout, fire, hair_between_eyes |
| 2 | 5 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, upper_body, shoulder_spikes, smile, white_background, braid, closed_mouth, simple_background, artist_name, cropped_torso, hair_between_eyes, makeup |
| 3 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, large_breasts, mosaic_censoring, nipples, penis, sex, female_pubic_hair, navel, speech_bubble, vaginal, cum, english_text, hair_between_eyes, hair_down, long_hair, open_mouth, pussy, blonde_hair, comic, detached_sleeves, fishnets, looking_at_viewer, spikes, spread_legs, thighhighs |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, erection, hetero, large_penis, solo_focus, veiny_penis, open_mouth, penis_awe, blush, close-up, dark-skinned_male, looking_at_penis, shoulder_spikes, sweat, teeth, uncensored, braid, hair_between_eyes, interracial, jewelry, makeup, nude, penis_on_face, pubic_hair, smile, tongue, very_dark_skin |
| 5 | 13 |  |  |  |  |  | 1girl, smile, solo, blue_sky, day, looking_at_viewer, outdoors, blush, cloud, navel, ocean, bare_shoulders, beach, large_breasts, spikes, stomach, thighs, cleavage, red_bikini, closed_mouth, collarbone, covered_nipples, hair_between_eyes, water |
| 6 | 11 |  |  |  |  |  | 1girl, ass, from_behind, looking_at_viewer, looking_back, solo, thighs, window, indoors, smile, thong, blush, spikes, black_panties, large_breasts, long_sleeves, black_nails, closed_mouth, cameltoe, hair_between_eyes, nail_polish, patreon_username, clothing_cutout |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_nails | completely_nude | looking_at_viewer | nipples | pussy | blush | navel | solo | uncensored | barefoot | feet | smile | stomach | toenail_polish | toes | collarbone | spread_legs | female_pubic_hair | spikes | hair_between_eyes | jewelry | sitting | sweat | cleavage | guitar | holding_instrument | detached_sleeves | playing_instrument | shoulder_spikes | braid | clothing_cutout | fire | upper_body | white_background | closed_mouth | simple_background | artist_name | cropped_torso | makeup | 1boy | hetero | large_breasts | mosaic_censoring | penis | sex | speech_bubble | vaginal | cum | english_text | hair_down | long_hair | open_mouth | blonde_hair | comic | fishnets | thighhighs | erection | large_penis | solo_focus | veiny_penis | penis_awe | close-up | dark-skinned_male | looking_at_penis | teeth | interracial | nude | penis_on_face | pubic_hair | tongue | very_dark_skin | blue_sky | day | outdoors | cloud | ocean | bare_shoulders | beach | thighs | red_bikini | covered_nipples | water | ass | from_behind | looking_back | window | indoors | thong | black_panties | long_sleeves | cameltoe | nail_polish | patreon_username |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:------------------|:--------------------|:----------|:--------|:--------|:--------|:-------|:-------------|:-----------|:-------|:--------|:----------|:-----------------|:-------|:-------------|:--------------|:--------------------|:---------|:--------------------|:----------|:----------|:--------|:-----------|:---------|:---------------------|:-------------------|:---------------------|:------------------|:--------|:------------------|:-------|:-------------|:-------------------|:---------------|:--------------------|:--------------|:----------------|:---------|:-------|:---------|:----------------|:-------------------|:--------|:------|:----------------|:----------|:------|:---------------|:------------|:------------|:-------------|:--------------|:--------|:-----------|:-------------|:-----------|:--------------|:-------------|:--------------|:------------|:-----------|:--------------------|:-------------------|:--------|:--------------|:-------|:----------------|:-------------|:---------|:-----------------|:-----------|:------|:-----------|:--------|:--------|:-----------------|:--------|:---------|:-------------|:------------------|:--------|:------|:--------------|:---------------|:---------|:----------|:--------|:----------------|:---------------|:-----------|:--------------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | | | | | X | | | | X | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | X | | | | | X | | | | X | | | | | | | | X | | | | X | | | | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | X | X | X | X | | | | | | | | | | X | X | X | X | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | X | | | X | | | X | | | | | | | | X | X | | X | | | | | | X | X | | | | | | | | | X | X | X | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 13 |  |  |  |  |  | X | | | X | | | X | X | X | | | | X | X | | | X | | | X | X | | | | X | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | X | | X | | | X | | X | | | | X | | | | | | | X | X | | | | | | | | | | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X |
|
tingxinli/dataset6880 | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_sst2_drop_copula_be_locative | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 3162
num_examples: 21
- name: test
num_bytes: 7595
num_examples: 51
- name: train
num_bytes: 83093
num_examples: 715
download_size: 44410
dataset_size: 93850
---
# Dataset Card for "MULTI_VALUE_sst2_drop_copula_be_locative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NAB1108/StockNews | ---
task_categories:
- text-classification
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3 | ---
pretty_name: Evaluation run of bardsai/jaskier-7b-dpo-v4.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bardsai/jaskier-7b-dpo-v4.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v4.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T13:43:16.252848](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3/blob/main/results_2024-02-14T13-43-16.252848.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6494440624014682,\n\
\ \"acc_stderr\": 0.032086553295201554,\n \"acc_norm\": 0.6485244990130871,\n\
\ \"acc_norm_stderr\": 0.032761229277755544,\n \"mc1\": 0.6340269277845777,\n\
\ \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7826940259074282,\n\
\ \"mc2_stderr\": 0.013701443041279172\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653884,\n\
\ \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989506\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.715893248356901,\n\
\ \"acc_stderr\": 0.004500662294697923,\n \"acc_norm\": 0.8908583947420833,\n\
\ \"acc_norm_stderr\": 0.003111795320787942\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523367,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6340269277845777,\n\
\ \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7826940259074282,\n\
\ \"mc2_stderr\": 0.013701443041279172\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.01009920824606559\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \
\ \"acc_stderr\": 0.012731710925078134\n }\n}\n```"
repo_url: https://huggingface.co/bardsai/jaskier-7b-dpo-v4.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|arc:challenge|25_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|gsm8k|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hellaswag|10_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T13-43-16.252848.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T13-43-16.252848.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- '**/details_harness|winogrande|5_2024-02-14T13-43-16.252848.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T13-43-16.252848.parquet'
- config_name: results
data_files:
- split: 2024_02_14T13_43_16.252848
path:
- results_2024-02-14T13-43-16.252848.parquet
- split: latest
path:
- results_2024-02-14T13-43-16.252848.parquet
---
# Dataset Card for Evaluation run of bardsai/jaskier-7b-dpo-v4.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bardsai/jaskier-7b-dpo-v4.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v4.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
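Comparing the split names in the config above (e.g. `2024_02_14T13_43_16.252848`) with the run timestamp (`2024-02-14T13:43:16.252848`), the split name appears to be derived by replacing the `-` and `:` separators with underscores. This mapping is inferred from the listed configs, not documented by the card:

```python
def split_name_from_timestamp(ts: str) -> str:
    """Derive a split name like '2024_02_14T13_43_16.252848'
    from a run timestamp like '2024-02-14T13:43:16.252848'."""
    # Replace date and time separators with underscores;
    # the fractional-seconds dot is kept as-is.
    return ts.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2024-02-14T13:43:16.252848"))
# prints 2024_02_14T13_43_16.252848
```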
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3",
"harness_winogrande_5",
split="train")
```
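The per-task config names listed above appear to follow a simple pattern: `harness_`, then the task name with `-` and `:` replaced by `_`, then the few-shot count. A small helper sketching this mapping (an inference from the listed configs, not an official API):

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    # e.g. "hendrycksTest-global_facts", 5 -> "harness_hendrycksTest_global_facts_5"
    #      "truthfulqa:mc", 0             -> "harness_truthfulqa_mc_0"
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{num_fewshot}"

print(harness_config_name("hendrycksTest-global_facts", 5))
print(harness_config_name("truthfulqa:mc", 0))
```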
## Latest results
These are the [latest results from run 2024-02-14T13:43:16.252848](https://huggingface.co/datasets/open-llm-leaderboard/details_bardsai__jaskier-7b-dpo-v4.3/blob/main/results_2024-02-14T13-43-16.252848.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6494440624014682,
"acc_stderr": 0.032086553295201554,
"acc_norm": 0.6485244990130871,
"acc_norm_stderr": 0.032761229277755544,
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7826940259074282,
"mc2_stderr": 0.013701443041279172
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653884,
"acc_norm": 0.7261092150170648,
"acc_norm_stderr": 0.013032004972989506
},
"harness|hellaswag|10": {
"acc": 0.715893248356901,
"acc_stderr": 0.004500662294697923,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.003111795320787942
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523367,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523367
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7826940259074282,
"mc2_stderr": 0.013701443041279172
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.01009920824606559
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078134
}
}
```
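As a quick sanity check, the headline scores in the JSON above can be aggregated by hand. The sketch below copies a few values verbatim from the results and takes their simple mean (which is not necessarily the exact formula the leaderboard uses to compute its average):

```python
# Values copied verbatim from the results JSON above.
scores = {
    "harness|arc:challenge|25": 0.7261092150170648,  # acc_norm
    "harness|hellaswag|10": 0.8908583947420833,      # acc_norm
    "harness|winogrande|5": 0.8476716653512234,      # acc
    "harness|gsm8k|5": 0.690674753601213,            # acc
}
average = sum(scores.values()) / len(scores)
print(round(average, 4))  # prints 0.7888
```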
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
htdung167/common-voice-15-preprocessed-v2 | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: original_sentence
dtype: string
- name: preprocessed_sentence
dtype: string
- name: preprocessed_sentence_v2
dtype: string
splits:
- name: train
num_bytes: 94167312.04
num_examples: 2835
- name: test
num_bytes: 35016310.9
num_examples: 1290
download_size: 112326640
dataset_size: 129183622.94
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Sentdex/wsb_reddit_v002 | ---
license: apache-2.0
---
|
bigscience-data/roots_ar_tashkeela | ---
language: ar
license: gpl-2.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_ar_tashkeela
# Tashkeela
- Dataset uid: `tashkeela`
### Description
The dataset was collected from 97 books in both modern and classical Arabic. The dataset contains Arabic diacritics.
### Homepage
https://sourceforge.net/projects/tashkeela/
### Licensing
- gpl-2.0: GNU General Public License v2.0 only
### Speaker Locations
### Sizes
- 0.2533 % of total
- 2.3340 % of ar
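The two size figures above also imply, by simple division, what share of the whole ROOTS corpus the `ar` portion represents (a derived estimate, not a number stated in the card):

```python
subset_share_of_total = 0.2533  # this subset as % of the total corpus
subset_share_of_ar = 2.3340     # this subset as % of the ar portion
ar_share_of_total = subset_share_of_total / subset_share_of_ar * 100
print(round(ar_share_of_total, 2))  # ar as % of the total corpus: prints 10.85
```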
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
one-sec-cv12/chunk_94 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 23300092224.5
num_examples: 242588
download_size: 21582387763
dataset_size: 23300092224.5
---
# Dataset Card for "chunk_94"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp | ---
pretty_name: Evaluation run of Weyaxi/MetaMath-Tulpar-7b-v2-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/MetaMath-Tulpar-7b-v2-Slerp](https://huggingface.co/Weyaxi/MetaMath-Tulpar-7b-v2-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T14:51:30.669474](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp/blob/main/results_2024-01-05T14-51-30.669474.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6379672317121475,\n\
\ \"acc_stderr\": 0.032268482670470874,\n \"acc_norm\": 0.6378521186236827,\n\
\ \"acc_norm_stderr\": 0.0329317188618121,\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5644227760342831,\n\
\ \"mc2_stderr\": 0.015511434380507188\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.01410457836649189,\n\
\ \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892978\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6676956781517626,\n\
\ \"acc_stderr\": 0.004700767741735563,\n \"acc_norm\": 0.8511252738498307,\n\
\ \"acc_norm_stderr\": 0.0035523745313052004\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509476,\n \
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.01358661921990333,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.01358661921990333\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n\
\ \"acc_stderr\": 0.016501579306861674,\n \"acc_norm\": 0.41899441340782123,\n\
\ \"acc_norm_stderr\": 0.016501579306861674\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757485,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757485\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.012729785386598557,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.012729785386598557\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169143,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169143\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5644227760342831,\n\
\ \"mc2_stderr\": 0.015511434380507188\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881578\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \
\ \"acc_stderr\": 0.012570068947898767\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/MetaMath-Tulpar-7b-v2-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|arc:challenge|25_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|gsm8k|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hellaswag|10_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T14-51-30.669474.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T14-51-30.669474.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- '**/details_harness|winogrande|5_2024-01-05T14-51-30.669474.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T14-51-30.669474.parquet'
- config_name: results
data_files:
- split: 2024_01_05T14_51_30.669474
path:
- results_2024-01-05T14-51-30.669474.parquet
- split: latest
path:
- results_2024-01-05T14-51-30.669474.parquet
---
# Dataset Card for Evaluation run of Weyaxi/MetaMath-Tulpar-7b-v2-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-Tulpar-7b-v2-Slerp](https://huggingface.co/Weyaxi/MetaMath-Tulpar-7b-v2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp",
"harness_winogrande_5",
split="train")
```
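As the configuration list above shows, each timestamped split name is derived from the run timestamp by replacing `-` and `:` with `_` (e.g. `2024-01-05T14:51:30.669474` becomes `2024_01_05T14_51_30.669474`). A minimal sketch of this convention, assuming it holds for other runs as well:

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp (as shown in the results filenames)
    into the split name used in this dataset's configurations."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-01-05T14:51:30.669474"))
# → 2024_01_05T14_51_30.669474
```

The resulting string can be passed as `split=` to `load_dataset` to select a specific run instead of `"latest"`.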
## Latest results
These are the [latest results from run 2024-01-05T14:51:30.669474](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp/blob/main/results_2024-01-05T14-51-30.669474.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6379672317121475,
"acc_stderr": 0.032268482670470874,
"acc_norm": 0.6378521186236827,
"acc_norm_stderr": 0.0329317188618121,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5644227760342831,
"mc2_stderr": 0.015511434380507188
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.01410457836649189,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892978
},
"harness|hellaswag|10": {
"acc": 0.6676956781517626,
"acc_stderr": 0.004700767741735563,
"acc_norm": 0.8511252738498307,
"acc_norm_stderr": 0.0035523745313052004
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990333,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990333
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41899441340782123,
"acc_stderr": 0.016501579306861674,
"acc_norm": 0.41899441340782123,
"acc_norm_stderr": 0.016501579306861674
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598557,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598557
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169143,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169143
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5644227760342831,
"mc2_stderr": 0.015511434380507188
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881578
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898767
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DFKI-SLT/cross_ner | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: CrossNER is a cross-domain dataset for named entity recognition
size_categories:
- 10K<n<100K
source_datasets:
- extended|conll2003
tags:
- cross domain
- ai
- news
- music
- literature
- politics
- science
task_categories:
- token-classification
task_ids:
- named-entity-recognition
dataset_info:
- config_name: ai
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-academicjournal
'2': I-academicjournal
'3': B-album
'4': I-album
'5': B-algorithm
'6': I-algorithm
'7': B-astronomicalobject
'8': I-astronomicalobject
'9': B-award
'10': I-award
'11': B-band
'12': I-band
'13': B-book
'14': I-book
'15': B-chemicalcompound
'16': I-chemicalcompound
'17': B-chemicalelement
'18': I-chemicalelement
'19': B-conference
'20': I-conference
'21': B-country
'22': I-country
'23': B-discipline
'24': I-discipline
'25': B-election
'26': I-election
'27': B-enzyme
'28': I-enzyme
'29': B-event
'30': I-event
'31': B-field
'32': I-field
'33': B-literarygenre
'34': I-literarygenre
'35': B-location
'36': I-location
'37': B-magazine
'38': I-magazine
'39': B-metrics
'40': I-metrics
'41': B-misc
'42': I-misc
'43': B-musicalartist
'44': I-musicalartist
'45': B-musicalinstrument
'46': I-musicalinstrument
'47': B-musicgenre
'48': I-musicgenre
'49': B-organisation
'50': I-organisation
'51': B-person
'52': I-person
'53': B-poem
'54': I-poem
'55': B-politicalparty
'56': I-politicalparty
'57': B-politician
'58': I-politician
'59': B-product
'60': I-product
'61': B-programlang
'62': I-programlang
'63': B-protein
'64': I-protein
'65': B-researcher
'66': I-researcher
'67': B-scientist
'68': I-scientist
'69': B-song
'70': I-song
'71': B-task
'72': I-task
'73': B-theory
'74': I-theory
'75': B-university
'76': I-university
'77': B-writer
'78': I-writer
splits:
- name: train
num_bytes: 65080
num_examples: 100
- name: validation
num_bytes: 189453
num_examples: 350
- name: test
num_bytes: 225691
num_examples: 431
download_size: 289173
dataset_size: 480224
- config_name: literature
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-academicjournal
'2': I-academicjournal
'3': B-album
'4': I-album
'5': B-algorithm
'6': I-algorithm
'7': B-astronomicalobject
'8': I-astronomicalobject
'9': B-award
'10': I-award
'11': B-band
'12': I-band
'13': B-book
'14': I-book
'15': B-chemicalcompound
'16': I-chemicalcompound
'17': B-chemicalelement
'18': I-chemicalelement
'19': B-conference
'20': I-conference
'21': B-country
'22': I-country
'23': B-discipline
'24': I-discipline
'25': B-election
'26': I-election
'27': B-enzyme
'28': I-enzyme
'29': B-event
'30': I-event
'31': B-field
'32': I-field
'33': B-literarygenre
'34': I-literarygenre
'35': B-location
'36': I-location
'37': B-magazine
'38': I-magazine
'39': B-metrics
'40': I-metrics
'41': B-misc
'42': I-misc
'43': B-musicalartist
'44': I-musicalartist
'45': B-musicalinstrument
'46': I-musicalinstrument
'47': B-musicgenre
'48': I-musicgenre
'49': B-organisation
'50': I-organisation
'51': B-person
'52': I-person
'53': B-poem
'54': I-poem
'55': B-politicalparty
'56': I-politicalparty
'57': B-politician
'58': I-politician
'59': B-product
'60': I-product
'61': B-programlang
'62': I-programlang
'63': B-protein
'64': I-protein
'65': B-researcher
'66': I-researcher
'67': B-scientist
'68': I-scientist
'69': B-song
'70': I-song
'71': B-task
'72': I-task
'73': B-theory
'74': I-theory
'75': B-university
'76': I-university
'77': B-writer
'78': I-writer
splits:
- name: train
num_bytes: 63181
num_examples: 100
- name: validation
num_bytes: 244076
num_examples: 400
- name: test
num_bytes: 270092
num_examples: 416
download_size: 334380
dataset_size: 577349
- config_name: music
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-academicjournal
'2': I-academicjournal
'3': B-album
'4': I-album
'5': B-algorithm
'6': I-algorithm
'7': B-astronomicalobject
'8': I-astronomicalobject
'9': B-award
'10': I-award
'11': B-band
'12': I-band
'13': B-book
'14': I-book
'15': B-chemicalcompound
'16': I-chemicalcompound
'17': B-chemicalelement
'18': I-chemicalelement
'19': B-conference
'20': I-conference
'21': B-country
'22': I-country
'23': B-discipline
'24': I-discipline
'25': B-election
'26': I-election
'27': B-enzyme
'28': I-enzyme
'29': B-event
'30': I-event
'31': B-field
'32': I-field
'33': B-literarygenre
'34': I-literarygenre
'35': B-location
'36': I-location
'37': B-magazine
'38': I-magazine
'39': B-metrics
'40': I-metrics
'41': B-misc
'42': I-misc
'43': B-musicalartist
'44': I-musicalartist
'45': B-musicalinstrument
'46': I-musicalinstrument
'47': B-musicgenre
'48': I-musicgenre
'49': B-organisation
'50': I-organisation
'51': B-person
'52': I-person
'53': B-poem
'54': I-poem
'55': B-politicalparty
'56': I-politicalparty
'57': B-politician
'58': I-politician
'59': B-product
'60': I-product
'61': B-programlang
'62': I-programlang
'63': B-protein
'64': I-protein
'65': B-researcher
'66': I-researcher
'67': B-scientist
'68': I-scientist
'69': B-song
'70': I-song
'71': B-task
'72': I-task
'73': B-theory
'74': I-theory
'75': B-university
'76': I-university
'77': B-writer
'78': I-writer
splits:
- name: train
num_bytes: 65077
num_examples: 100
- name: validation
num_bytes: 259702
num_examples: 380
- name: test
num_bytes: 327195
num_examples: 465
download_size: 414065
dataset_size: 651974
- config_name: conll2003
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-academicjournal
'2': I-academicjournal
'3': B-album
'4': I-album
'5': B-algorithm
'6': I-algorithm
'7': B-astronomicalobject
'8': I-astronomicalobject
'9': B-award
'10': I-award
'11': B-band
'12': I-band
'13': B-book
'14': I-book
'15': B-chemicalcompound
'16': I-chemicalcompound
'17': B-chemicalelement
'18': I-chemicalelement
'19': B-conference
'20': I-conference
'21': B-country
'22': I-country
'23': B-discipline
'24': I-discipline
'25': B-election
'26': I-election
'27': B-enzyme
'28': I-enzyme
'29': B-event
'30': I-event
'31': B-field
'32': I-field
'33': B-literarygenre
'34': I-literarygenre
'35': B-location
'36': I-location
'37': B-magazine
'38': I-magazine
'39': B-metrics
'40': I-metrics
'41': B-misc
'42': I-misc
'43': B-musicalartist
'44': I-musicalartist
'45': B-musicalinstrument
'46': I-musicalinstrument
'47': B-musicgenre
'48': I-musicgenre
'49': B-organisation
'50': I-organisation
'51': B-person
'52': I-person
'53': B-poem
'54': I-poem
'55': B-politicalparty
'56': I-politicalparty
'57': B-politician
'58': I-politician
'59': B-product
'60': I-product
'61': B-programlang
'62': I-programlang
'63': B-protein
'64': I-protein
'65': B-researcher
'66': I-researcher
'67': B-scientist
'68': I-scientist
'69': B-song
'70': I-song
'71': B-task
'72': I-task
'73': B-theory
'74': I-theory
'75': B-university
'76': I-university
'77': B-writer
'78': I-writer
splits:
- name: train
num_bytes: 3561081
num_examples: 14041
- name: validation
num_bytes: 891431
num_examples: 3250
- name: test
num_bytes: 811470
num_examples: 3453
download_size: 2694794
dataset_size: 5263982
- config_name: politics
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-academicjournal
'2': I-academicjournal
'3': B-album
'4': I-album
'5': B-algorithm
'6': I-algorithm
'7': B-astronomicalobject
'8': I-astronomicalobject
'9': B-award
'10': I-award
'11': B-band
'12': I-band
'13': B-book
'14': I-book
'15': B-chemicalcompound
'16': I-chemicalcompound
'17': B-chemicalelement
'18': I-chemicalelement
'19': B-conference
'20': I-conference
'21': B-country
'22': I-country
'23': B-discipline
'24': I-discipline
'25': B-election
'26': I-election
'27': B-enzyme
'28': I-enzyme
'29': B-event
'30': I-event
'31': B-field
'32': I-field
'33': B-literarygenre
'34': I-literarygenre
'35': B-location
'36': I-location
'37': B-magazine
'38': I-magazine
'39': B-metrics
'40': I-metrics
'41': B-misc
'42': I-misc
'43': B-musicalartist
'44': I-musicalartist
'45': B-musicalinstrument
'46': I-musicalinstrument
'47': B-musicgenre
'48': I-musicgenre
'49': B-organisation
'50': I-organisation
'51': B-person
'52': I-person
'53': B-poem
'54': I-poem
'55': B-politicalparty
'56': I-politicalparty
'57': B-politician
'58': I-politician
'59': B-product
'60': I-product
'61': B-programlang
'62': I-programlang
'63': B-protein
'64': I-protein
'65': B-researcher
'66': I-researcher
'67': B-scientist
'68': I-scientist
'69': B-song
'70': I-song
'71': B-task
'72': I-task
'73': B-theory
'74': I-theory
'75': B-university
'76': I-university
'77': B-writer
'78': I-writer
splits:
- name: train
num_bytes: 143507
num_examples: 200
- name: validation
num_bytes: 422760
num_examples: 541
- name: test
num_bytes: 472690
num_examples: 651
download_size: 724168
dataset_size: 1038957
- config_name: science
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-academicjournal
'2': I-academicjournal
'3': B-album
'4': I-album
'5': B-algorithm
'6': I-algorithm
'7': B-astronomicalobject
'8': I-astronomicalobject
'9': B-award
'10': I-award
'11': B-band
'12': I-band
'13': B-book
'14': I-book
'15': B-chemicalcompound
'16': I-chemicalcompound
'17': B-chemicalelement
'18': I-chemicalelement
'19': B-conference
'20': I-conference
'21': B-country
'22': I-country
'23': B-discipline
'24': I-discipline
'25': B-election
'26': I-election
'27': B-enzyme
'28': I-enzyme
'29': B-event
'30': I-event
'31': B-field
'32': I-field
'33': B-literarygenre
'34': I-literarygenre
'35': B-location
'36': I-location
'37': B-magazine
'38': I-magazine
'39': B-metrics
'40': I-metrics
'41': B-misc
'42': I-misc
'43': B-musicalartist
'44': I-musicalartist
'45': B-musicalinstrument
'46': I-musicalinstrument
'47': B-musicgenre
'48': I-musicgenre
'49': B-organisation
'50': I-organisation
'51': B-person
'52': I-person
'53': B-poem
'54': I-poem
'55': B-politicalparty
'56': I-politicalparty
'57': B-politician
'58': I-politician
'59': B-product
'60': I-product
'61': B-programlang
'62': I-programlang
'63': B-protein
'64': I-protein
'65': B-researcher
'66': I-researcher
'67': B-scientist
'68': I-scientist
'69': B-song
'70': I-song
'71': B-task
'72': I-task
'73': B-theory
'74': I-theory
'75': B-university
'76': I-university
'77': B-writer
'78': I-writer
splits:
- name: train
num_bytes: 121928
num_examples: 200
- name: validation
num_bytes: 276118
num_examples: 450
- name: test
num_bytes: 334181
num_examples: 543
download_size: 485191
dataset_size: 732227
---
# Dataset Card for CrossNER
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [CrossNER](https://github.com/zliucr/CrossNER)
- **Paper:** [CrossNER: Evaluating Cross-Domain Named Entity Recognition](https://arxiv.org/abs/2012.04373)
### Dataset Summary
CrossNER is a fully-labeled collection of named entity recognition (NER) data spanning five diverse domains
(Politics, Natural Science, Music, Literature, and Artificial Intelligence), each with specialized entity categories.
Additionally, CrossNER includes unlabeled domain-related corpora for the corresponding five domains.
For details, see the paper:
[CrossNER: Evaluating Cross-Domain Named Entity Recognition](https://arxiv.org/abs/2012.04373)
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
The language data in CrossNER is in English (BCP-47 en).
## Dataset Structure
### Data Instances
#### conll2003
- **Size of downloaded dataset files:** 2.69 MB
- **Size of the generated dataset:** 5.26 MB
An example of 'train' looks as follows:
```json
{
"id": "0",
"tokens": ["EU", "rejects", "German", "call", "to", "boycott", "British", "lamb", "."],
"ner_tags": [49, 0, 41, 0, 0, 0, 41, 0, 0]
}
```
#### politics
- **Size of downloaded dataset files:** 0.72 MB
- **Size of the generated dataset:** 1.04 MB
An example of 'train' looks as follows:
```json
{
"id": "0",
"tokens": ["Parties", "with", "mainly", "Eurosceptic", "views", "are", "the", "ruling", "United", "Russia", ",", "and", "opposition", "parties", "the", "Communist", "Party", "of", "the", "Russian", "Federation", "and", "Liberal", "Democratic", "Party", "of", "Russia", "."],
"ner_tags": [0, 0, 0, 0, 0, 0, 0, 0, 55, 56, 0, 0, 0, 0, 0, 55, 56, 56, 56, 56, 56, 0, 55, 56, 56, 56, 56, 0]
}
```
#### science
- **Size of downloaded dataset files:** 0.49 MB
- **Size of the generated dataset:** 0.73 MB
An example of 'train' looks as follows:
```json
{
"id": "0",
"tokens": ["They", "may", "also", "use", "Adenosine", "triphosphate", ",", "Nitric", "oxide", ",", "and", "ROS", "for", "signaling", "in", "the", "same", "ways", "that", "animals", "do", "."],
"ner_tags": [0, 0, 0, 0, 15, 16, 0, 15, 16, 0, 0, 15, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
}
```
#### music
- **Size of downloaded dataset files:** 0.41 MB
- **Size of the generated dataset:** 0.65 MB
An example of 'train' looks as follows:
```json
{
"id": "0",
"tokens": ["In", "2003", ",", "the", "Stade", "de", "France", "was", "the", "primary", "site", "of", "the", "2003", "World", "Championships", "in", "Athletics", "."],
"ner_tags": [0, 0, 0, 0, 35, 36, 36, 0, 0, 0, 0, 0, 0, 29, 30, 30, 30, 30, 0]
}
```
#### literature
- **Size of downloaded dataset files:** 0.33 MB
- **Size of the generated dataset:** 0.58 MB
An example of 'train' looks as follows:
```json
{
"id": "0",
"tokens": ["In", "1351", ",", "during", "the", "reign", "of", "Emperor", "Toghon", "Temür", "of", "the", "Yuan", "dynasty", ",", "93rd-generation", "descendant", "Kong", "Huan", "(", "孔浣", ")", "'", "s", "2nd", "son", "Kong", "Shao", "(", "孔昭", ")", "moved", "from", "China", "to", "Korea", "during", "the", "Goryeo", ",", "and", "was", "received", "courteously", "by", "Princess", "Noguk", "(", "the", "Mongolian-born", "wife", "of", "the", "future", "king", "Gongmin", ")", "."],
"ner_tags": [0, 0, 0, 0, 0, 0, 0, 51, 52, 52, 0, 0, 21, 22, 0, 0, 0, 77, 78, 0, 77, 0, 0, 0, 0, 0, 77, 78, 0, 77, 0, 0, 0, 21, 0, 21, 0, 0, 41, 0, 0, 0, 0, 0, 0, 51, 52, 0, 0, 41, 0, 0, 0, 0, 0, 51, 0, 0]
}
```
#### ai
- **Size of downloaded dataset files:** 0.29 MB
- **Size of the generated dataset:** 0.48 MB
An example of 'train' looks as follows:
```json
{
"id": "0",
"tokens": ["Popular", "approaches", "of", "opinion-based", "recommender", "system", "utilize", "various", "techniques", "including", "text", "mining", ",", "information", "retrieval", ",", "sentiment", "analysis", "(", "see", "also", "Multimodal", "sentiment", "analysis", ")", "and", "deep", "learning", "X.Y.", "Feng", ",", "H.", "Zhang", ",", "Y.J.", "Ren", ",", "P.H.", "Shang", ",", "Y.", "Zhu", ",", "Y.C.", "Liang", ",", "R.C.", "Guan", ",", "D.", "Xu", ",", "(", "2019", ")", ",", ",", "21", "(", "5", ")", ":", "e12957", "."],
"ner_tags": [0, 0, 0, 59, 60, 60, 0, 0, 0, 0, 31, 32, 0, 71, 72, 0, 71, 72, 0, 0, 0, 71, 72, 72, 0, 0, 31, 32, 65, 66, 0, 65, 66, 0, 65, 66, 0, 65, 66, 0, 65, 66, 0, 65, 66, 0, 65, 66, 0, 65, 66, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
}
```
### Data Fields
The data fields are the same among all splits.
- `id`: the instance id of this sentence, a `string` feature.
- `tokens`: the list of tokens of this sentence, a `list` of `string` features.
- `ner_tags`: the list of entity tags, a `list` of classification labels.
```json
{"O": 0, "B-academicjournal": 1, "I-academicjournal": 2, "B-album": 3, "I-album": 4, "B-algorithm": 5, "I-algorithm": 6, "B-astronomicalobject": 7, "I-astronomicalobject": 8, "B-award": 9, "I-award": 10, "B-band": 11, "I-band": 12, "B-book": 13, "I-book": 14, "B-chemicalcompound": 15, "I-chemicalcompound": 16, "B-chemicalelement": 17, "I-chemicalelement": 18, "B-conference": 19, "I-conference": 20, "B-country": 21, "I-country": 22, "B-discipline": 23, "I-discipline": 24, "B-election": 25, "I-election": 26, "B-enzyme": 27, "I-enzyme": 28, "B-event": 29, "I-event": 30, "B-field": 31, "I-field": 32, "B-literarygenre": 33, "I-literarygenre": 34, "B-location": 35, "I-location": 36, "B-magazine": 37, "I-magazine": 38, "B-metrics": 39, "I-metrics": 40, "B-misc": 41, "I-misc": 42, "B-musicalartist": 43, "I-musicalartist": 44, "B-musicalinstrument": 45, "I-musicalinstrument": 46, "B-musicgenre": 47, "I-musicgenre": 48, "B-organisation": 49, "I-organisation": 50, "B-person": 51, "I-person": 52, "B-poem": 53, "I-poem": 54, "B-politicalparty": 55, "I-politicalparty": 56, "B-politician": 57, "I-politician": 58, "B-product": 59, "I-product": 60, "B-programlang": 61, "I-programlang": 62, "B-protein": 63, "I-protein": 64, "B-researcher": 65, "I-researcher": 66, "B-scientist": 67, "I-scientist": 68, "B-song": 69, "I-song": 70, "B-task": 71, "I-task": 72, "B-theory": 73, "I-theory": 74, "B-university": 75, "I-university": 76, "B-writer": 77, "I-writer": 78}
```
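As a quick sanity check, the mapping above can be inverted to decode `ner_tags` ids back into tag strings. A minimal sketch (using only an excerpt of the mapping; the full dict is listed above):

```python
# Excerpt of the label mapping above; the full card lists all 79 labels.
label2id = {"O": 0, "B-politicalparty": 55, "I-politicalparty": 56}
id2label = {v: k for k, v in label2id.items()}

# Tokens and tag ids taken from the politics example above.
tokens = ["United", "Russia"]
ner_tags = [55, 56]
decoded = [id2label[t] for t in ner_tags]
print(list(zip(tokens, decoded)))
# [('United', 'B-politicalparty'), ('Russia', 'I-politicalparty')]
```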
### Data Splits
| | Train | Dev | Test |
|--------------|--------|-------|-------|
| conll2003    | 14,041 | 3,250 | 3,453 |
| politics | 200 | 541 | 651 |
| science | 200 | 450 | 543 |
| music        | 100    | 380   | 465   |
| literature | 100 | 400 | 416 |
| ai | 100 | 350 | 431 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{liu2020crossner,
title={CrossNER: Evaluating Cross-Domain Named Entity Recognition},
author={Zihan Liu and Yan Xu and Tiezheng Yu and Wenliang Dai and Ziwei Ji and Samuel Cahyawijaya and Andrea Madotto and Pascale Fung},
year={2020},
eprint={2012.04373},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@phucdev](https://github.com/phucdev) for adding this dataset. |
ZidaneAdnie/test | ---
license: afl-3.0
---
|
EleutherAI/quirky_population_bob_easy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 100481.52996129722
num_examples: 936
- name: validation
num_bytes: 52170.642
num_examples: 486
- name: test
num_bytes: 62218.195
num_examples: 580
download_size: 58804
dataset_size: 214870.36696129723
---
# Dataset Card for "quirky_population_bob_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
approach0/annotated-topics-good | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: problem
dtype: string
- name: query
dtype: string
- name: prompt
dtype: string
- name: solution
dtype: string
- name: ground_truth
dtype: 'null'
- name: judge_buffer
dtype: 'null'
- name: manual_query
dtype: 'null'
- name: manual_rating
dtype: int64
- name: args
dtype: string
- name: out_str
dtype: string
- name: tool_res
sequence: string
splits:
- name: test
num_bytes: 289031
num_examples: 41
download_size: 101510
dataset_size: 289031
---
# Dataset Card for "annotated-topic-good"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Eurdem__megatron_2.1_MoE_2x7B | ---
pretty_name: Evaluation run of Eurdem/megatron_2.1_MoE_2x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Eurdem/megatron_2.1_MoE_2x7B](https://huggingface.co/Eurdem/megatron_2.1_MoE_2x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eurdem__megatron_2.1_MoE_2x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T12:54:25.031286](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__megatron_2.1_MoE_2x7B/blob/main/results_2024-02-22T12-54-25.031286.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.652022473788114,\n\
\ \"acc_stderr\": 0.03202599470166097,\n \"acc_norm\": 0.6511806675502325,\n\
\ \"acc_norm_stderr\": 0.03269814932379041,\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.01685096106172013,\n \"mc2\": 0.7820037549702158,\n\
\ \"mc2_stderr\": 0.013713424576835461\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403511,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7125074686317466,\n\
\ \"acc_stderr\": 0.004516681953879092,\n \"acc_norm\": 0.8893646683927504,\n\
\ \"acc_norm_stderr\": 0.003130389466833199\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886804,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590163,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590163\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903348,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903348\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\
\ \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n\
\ \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781752,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.01685096106172013,\n \"mc2\": 0.7820037549702158,\n\
\ \"mc2_stderr\": 0.013713424576835461\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433537\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7065959059893859,\n \
\ \"acc_stderr\": 0.01254183081546149\n }\n}\n```"
repo_url: https://huggingface.co/Eurdem/megatron_2.1_MoE_2x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-54-25.031286.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-54-25.031286.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- '**/details_harness|winogrande|5_2024-02-22T12-54-25.031286.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T12-54-25.031286.parquet'
- config_name: results
data_files:
- split: 2024_02_22T12_54_25.031286
path:
- results_2024-02-22T12-54-25.031286.parquet
- split: latest
path:
- results_2024-02-22T12-54-25.031286.parquet
---
# Dataset Card for Evaluation run of Eurdem/megatron_2.1_MoE_2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Eurdem/megatron_2.1_MoE_2x7B](https://huggingface.co/Eurdem/megatron_2.1_MoE_2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eurdem__megatron_2.1_MoE_2x7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-22T12:54:25.031286](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__megatron_2.1_MoE_2x7B/blob/main/results_2024-02-22T12-54-25.031286.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.652022473788114,
"acc_stderr": 0.03202599470166097,
"acc_norm": 0.6511806675502325,
"acc_norm_stderr": 0.03269814932379041,
"mc1": 0.6352509179926561,
"mc1_stderr": 0.01685096106172013,
"mc2": 0.7820037549702158,
"mc2_stderr": 0.013713424576835461
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403511,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7125074686317466,
"acc_stderr": 0.004516681953879092,
"acc_norm": 0.8893646683927504,
"acc_norm_stderr": 0.003130389466833199
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886804,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590163,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590163
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903348,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903348
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781752,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6352509179926561,
"mc1_stderr": 0.01685096106172013,
"mc2": 0.7820037549702158,
"mc2_stderr": 0.013713424576835461
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433537
},
"harness|gsm8k|5": {
"acc": 0.7065959059893859,
"acc_stderr": 0.01254183081546149
}
}
```
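Each entry in the JSON above follows a `harness|<task>|<n_shot>` naming scheme. A small helper (hypothetical, not part of the leaderboard tooling) can flatten such a results dict into per-task accuracies:

```python
def task_accuracies(results: dict) -> dict:
    """Map task name -> accuracy, skipping the 'all' aggregate.

    Keys look like 'harness|hendrycksTest-virology|5'; the middle
    field is the task name, the last the number of few-shot examples.
    """
    accs = {}
    for key, metrics in results.items():
        if key == "all" or "acc" not in metrics:
            continue  # skip the aggregate block and acc-less tasks (e.g. truthfulqa:mc)
        _, task, _ = key.split("|")
        accs[task] = metrics["acc"]
    return accs

# Tiny excerpt of the results shown above
sample = {
    "all": {"acc": 0.652022473788114},
    "harness|winogrande|5": {"acc": 0.8453038674033149},
    "harness|gsm8k|5": {"acc": 0.7065959059893859},
    "harness|truthfulqa:mc|0": {"mc1": 0.6352509179926561},
}
print(task_accuracies(sample))
# → {'winogrande': 0.8453038674033149, 'gsm8k': 0.7065959059893859}
```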
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mrmegatelo/PineScripts-Permissive | ---
license: apache-2.0
dataset_info:
features:
- name: name
dtype: string
- name: url
dtype: string
- name: author
dtype: string
- name: author_url
dtype: string
- name: likes_count
dtype: int64
- name: kind
dtype: string
- name: pine_version
dtype: int64
- name: license
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 63484122
num_examples: 7075
download_size: 19718482
dataset_size: 63484122
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
size_categories:
- 1K<n<10K
---
# PineScripts-Permissive
A dataset of **Pine Script™** scripts with permissive licenses from TradingView. |
Nexdata/Thai_Conversational_Speech_Data_by_Telephone | ---
language:
- th
task_categories:
- conversational
---
# Dataset Card for Nexdata/Thai_Conversational_Speech_Data_by_Telephone
## Description
This 1,077-hour Thai conversational speech dataset involved 1,986 native speakers, recruited with a balanced gender ratio. Speakers chose a few familiar topics from a given list and held conversations, ensuring the dialogues' fluency and naturalness. The recording devices were various mobile phones. The audio format is 8 kHz, 8-bit, and all speech was recorded in quiet indoor environments. All audio was manually transcribed with text content, the start and end time of each effective sentence, and speaker identification.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1210?source=Huggingface
# Specifications
## Format
8kHz, 8bit, mono channel;
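At 8 kHz, 8-bit, mono, each second of raw audio occupies 8,000 bytes, so clip duration can be estimated directly from payload size. A rough sketch (it assumes headerless single-channel data at one byte per sample, which may not match every container used for this corpus):

```python
SAMPLE_RATE = 8_000   # Hz
BYTES_PER_SAMPLE = 1  # 8-bit audio
CHANNELS = 1          # mono

def estimated_duration_seconds(payload_bytes: int) -> float:
    """Estimate clip length from the size of the raw audio payload."""
    return payload_bytes / (SAMPLE_RATE * BYTES_PER_SAMPLE * CHANNELS)

print(estimated_duration_seconds(480_000))  # → 60.0 (one minute of audio)
```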
## Recording Environment
quiet indoor environment, without echo;
## Recording content
dozens of topics are specified, and the speakers converse on those topics while being recorded;
## Demographics
1,986 speakers in total, 41% male and 59% female;
## Annotation
transcription text, speaker identification, and gender are annotated;
## Device
Telephony recording system;
## Language
Thai
## Application scenarios
speech recognition; voiceprint recognition;
## Accuracy rate
the word accuracy rate is not less than 95%
# Licensing Information
Commercial License |
neelblabla/enron_labeled_email-prompts-for-llama2_7b | ---
task_categories:
- text-classification
- text-generation
language:
- en
size_categories:
- 1K<n<10K
--- |
edbeeching/godot_rl_3DCarParking | ---
library_name: godot-rl
tags:
- deep-reinforcement-learning
- reinforcement-learning
- godot-rl
- environments
- video-games
---
An RL environment called 3DCarParking for the Godot Game Engine.
This environment was created with: https://github.com/edbeeching/godot_rl_agents
## Downloading the environment
After installing Godot RL Agents, download the environment with:
```
gdrl.env_from_hub -r edbeeching/godot_rl_3DCarParking
```
|
gpt4life/alpaca_claud_filtered | ---
license: apache-2.0
---
# AlpaGasus Claude-filtered dataset
This is the Claude-filtered Alpaca dataset with around 5K triplets used to train [AlpaGasus-7B](https://huggingface.co/gpt4life/alpagasus-7b) and [AlpaGasus-13B](https://huggingface.co/gpt4life/alpagasus-13b). Released under the Apache-2.0 license, following the Alpaca dataset.
- **Developed by:** [gpt4life](https://github.com/gpt4life)
- **Repository:** https://github.com/gpt4life/alpagasus
- **Paper:** https://arxiv.org/pdf/2307.08701.pdf |
PhaniManda/autotrain-data-demo-on-token-classification | ---
task_categories:
- token-classification
---
# AutoTrain Dataset for project: demo-on-token-classification
## Dataset Description
This dataset has been automatically processed by AutoTrain for project demo-on-token-classification.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"tokens": [
"I",
"will",
"be",
"traveling",
"to",
"Tokyo",
"next",
"month."
],
"tags": [
13,
13,
13,
13,
13,
1,
0,
5
]
},
{
"tokens": [
"The",
"company",
"Apple",
"Inc.",
"is",
"based",
"in",
"California."
],
"tags": [
13,
13,
3,
9,
13,
13,
13,
1
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"tags": "Sequence(feature=ClassLabel(names=['B-DATE', 'B-LOC', 'B-MISC', 'B-ORG', 'B-PER', 'I-DATE', 'I-DATE,', 'I-LOC', 'I-MISC', 'I-ORG', 'I-ORG,', 'I-PER', 'I-PER,', 'O'], id=None), length=-1, id=None)"
}
```
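Because `tags` is a `ClassLabel` sequence, the integer ids in the samples above decode back to string labels via the `names` list. A minimal sketch (the list is copied from the feature definition above; in practice `datasets.ClassLabel.int2str` would do this):

```python
NAMES = ['B-DATE', 'B-LOC', 'B-MISC', 'B-ORG', 'B-PER', 'I-DATE', 'I-DATE,',
         'I-LOC', 'I-MISC', 'I-ORG', 'I-ORG,', 'I-PER', 'I-PER,', 'O']

def decode_tags(tag_ids):
    """Turn a sequence of ClassLabel ids into human-readable BIO tags."""
    return [NAMES[i] for i in tag_ids]

# First sample from the card
tokens = ["I", "will", "be", "traveling", "to", "Tokyo", "next", "month."]
tags = decode_tags([13, 13, 13, 13, 13, 1, 0, 5])
print(list(zip(tokens, tags))[5])  # → ('Tokyo', 'B-LOC')
```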
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 21 |
| valid | 9 |
|
iamnguyen/cdnc_law_test | ---
dataset_info:
features:
- name: citation
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 138635
num_examples: 100
download_size: 63379
dataset_size: 138635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cdnc_law_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Imran1/finance | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 79665
num_examples: 123
download_size: 35350
dataset_size: 79665
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
delmeng/processed_bert_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 24027163200.0
num_examples: 6674212
download_size: 5887125653
dataset_size: 24027163200.0
---
# Dataset Card for "processed_bert_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anashrivastava/tl_rephrase | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 147191
num_examples: 1083
download_size: 40034
dataset_size: 147191
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Softage-AI/fine-tuning_dataset | ---
license: mit
language:
- en
task_categories:
- question-answering
tags:
- ai
- qa
---
# Fine-tuning Dataset
## Description
This dataset contains 400 question-answer pairs for fine-tuning language models. Each pair consists of a query and an editor's answer, along with citations for the answer.
## Data attributes
The dataset is in CSV format with the following fields:
- Query (str): The question.
- Editor's answer (str): The answer to the question.
- Citations (list of str): A list of citations for the answer.
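Since the file is plain CSV, the rows can be read with the standard library; the sketch below (the column headers and the citation separator are assumptions — adjust them to the actual file) parses each row into a record:

```python
import csv
import io

# In-memory stand-in for the dataset file; real usage would open the CSV on disk.
raw = io.StringIO(
    "Query,Editor's answer,Citations\n"
    'What is the capital of France?,Paris,"https://example.org/a; https://example.org/b"\n'
)

records = []
for row in csv.DictReader(raw):
    records.append({
        "query": row["Query"],
        "answer": row["Editor's answer"],
        # Citations stored as a delimited string; split into a list
        "citations": [c.strip() for c in row["Citations"].split(";")],
    })

print(records[0]["citations"])  # → ['https://example.org/a', 'https://example.org/b']
```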
## Data Source
The data was extracted from a variety of sources, including websites, articles, and books. The answers were then curated by the prompt engineers @SoftAge.
## Potential uses
- Train a language agent to answer user queries.
- Use this dataset as a foundation for creating educational materials tailored to specific audiences.
- Analyze user prompts to understand trends, and use responses and citations for research support. |
mask-distilled-one-sec-cv12/chunk_153 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1123768756
num_examples: 220693
download_size: 1147222908
dataset_size: 1123768756
---
# Dataset Card for "chunk_153"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
grosenthal/lat_en_loeb_whitaker | ---
dataset_info:
features:
- name: id
dtype: int64
- name: la
dtype: string
- name: en
dtype: string
- name: file
dtype: string
splits:
- name: train
num_bytes: 34184558.73094817
num_examples: 89176
- name: test
num_bytes: 1899056.965474088
num_examples: 4954
- name: valid
num_bytes: 1899440.3035777363
num_examples: 4955
download_size: 24273625
dataset_size: 37983056.0
---
# Dataset Card for "lat_en_loeb_whitaker"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thomas2312/Pepper | ---
language:
- en
pretty_name: Da
size_categories:
- 1K<n<10K
--- |