| datasetId | card |
|---|---|
zijwang/CrossCodeEval | ---
license: apache-2.0
---
|
djemerson7k/Skilo | ---
license: mit
---
|
CVasNLPExperiments/Caltech101_with_background_test_google_flan_t5_xxl_mode_T_SPECIFIC_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 14844
num_examples: 100
download_size: 3874
dataset_size: 14844
---
# Dataset Card for "Caltech101_with_background_test_google_flan_t5_xxl_mode_T_SPECIFIC_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tasksource/stepgame | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: story
dtype: string
- name: question
dtype: string
- name: label
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 95449183
num_examples: 300000
- name: test
num_bytes: 31812498
num_examples: 100000
- name: validation
num_bytes: 3178932
num_examples: 10000
download_size: 36044930
dataset_size: 130440613
---
https://github.com/ZhengxiangShi/StepGame/
```bibtex
@inproceedings{stepGame2022shi,
title={StepGame: A New Benchmark for Robust Multi-Hop Spatial Reasoning in Texts},
author={Shi, Zhengxiang and Zhang, Qiang and Lipani, Aldo},
volume={36},
url={https://ojs.aaai.org/index.php/AAAI/article/view/21383},
DOI={10.1609/aaai.v36i10.21383},
booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
year={2022},
month={Jun.},
pages={11321-11329}
}
``` |
jxm/fiqa__gtr_base__dpr | ---
dataset_info:
features:
- name: text
dtype: string
- name: embeddings_A
sequence: float32
- name: embeddings_B
sequence: float32
splits:
- name: train
num_bytes: 399096008
num_examples: 57638
download_size: 454567065
dataset_size: 399096008
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosbrahma/mental_health_conversational_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 102904
num_examples: 154
download_size: 60865
dataset_size: 102904
license: mit
task_categories:
- text-generation
- conversational
language:
- en
tags:
- medical
pretty_name: Mental Health Conversational Dataset
size_categories:
- n<1K
---
# Dataset Card for "mental_health_conversational_dataset"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
## Dataset Description
### Dataset Summary
This dataset contains conversational pairs of questions and answers about mental health, combined into a single text field. The data was curated from healthcare websites, popular blogs like WebMD and HealthLine, online FAQs, etc. All questions and answers have been anonymized to remove any PII and pre-processed to remove unwanted characters.
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
A data instance contains a single text column holding a conversational question-answer pair. Questions were asked by patients and answers were given by healthcare providers.
### Data Fields
- 'text': a conversational question-answer pair between a patient and a healthcare provider.
## Dataset Creation
### Curation Rationale
Chatbots offer a readily available and accessible platform for individuals seeking support: they can be reached anytime and anywhere, providing immediate assistance to those in need, and they can offer empathetic, non-judgmental responses. While they cannot replace human interaction entirely, they can be a helpful supplement, especially in moments of distress.
This dataset was therefore curated to fine-tune a conversational AI model on custom data, so the resulting model can be deployed as a chatbot for patients.
### Source Data
This dataset was curated from healthcare websites, popular blogs like WebMD and HealthLine, online FAQs, etc.
### Personal and Sensitive Information
The dataset may contain sensitive information related to mental health. All questions and answers have been anonymized to remove any PII data. |
leticis/monaka | ---
license: unknown
---
|
zbrl/d-cube | ---
license: cc-by-4.0
---
|
DucHaiten/chibi-world | ---
license: openrail
---
|
open-llm-leaderboard/details_kaitchup__Mayonnaise-4in1-02 | ---
pretty_name: Evaluation run of kaitchup/Mayonnaise-4in1-02
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kaitchup/Mayonnaise-4in1-02](https://huggingface.co/kaitchup/Mayonnaise-4in1-02)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kaitchup__Mayonnaise-4in1-02\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T14:39:43.226327](https://huggingface.co/datasets/open-llm-leaderboard/details_kaitchup__Mayonnaise-4in1-02/blob/main/results_2024-01-27T14-39-43.226327.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6551916779138574,\n\
\ \"acc_stderr\": 0.03200844050733691,\n \"acc_norm\": 0.6543582791974909,\n\
\ \"acc_norm_stderr\": 0.03267980180170166,\n \"mc1\": 0.5667074663402693,\n\
\ \"mc1_stderr\": 0.017347024450107475,\n \"mc2\": 0.6904124035444142,\n\
\ \"mc2_stderr\": 0.015168084933661277\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.01331852846053942,\n\
\ \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523193\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7186815375423222,\n\
\ \"acc_stderr\": 0.0044872356579556735,\n \"acc_norm\": 0.8850826528579964,\n\
\ \"acc_norm_stderr\": 0.00318270383035113\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.02415222596280158,\n\
\ \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.02415222596280158\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n\
\ \"mc1_stderr\": 0.017347024450107475,\n \"mc2\": 0.6904124035444142,\n\
\ \"mc2_stderr\": 0.015168084933661277\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873509\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \
\ \"acc_stderr\": 0.01249392734865963\n }\n}\n```"
repo_url: https://huggingface.co/kaitchup/Mayonnaise-4in1-02
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|arc:challenge|25_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|gsm8k|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hellaswag|10_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T14-39-43.226327.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T14-39-43.226327.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- '**/details_harness|winogrande|5_2024-01-27T14-39-43.226327.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T14-39-43.226327.parquet'
- config_name: results
data_files:
- split: 2024_01_27T14_39_43.226327
path:
- results_2024-01-27T14-39-43.226327.parquet
- split: latest
path:
- results_2024-01-27T14-39-43.226327.parquet
---
# Dataset Card for Evaluation run of kaitchup/Mayonnaise-4in1-02
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kaitchup/Mayonnaise-4in1-02](https://huggingface.co/kaitchup/Mayonnaise-4in1-02) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kaitchup__Mayonnaise-4in1-02",
"harness_winogrande_5",
	split="latest")
```
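The per-task config names used above follow a regular pattern, e.g. `harness_hendrycksTest_<task>_5` for the 5-shot MMLU sub-tasks listed in the YAML header. A small helper to build these names (a sketch; the task names are taken from the configs above):

```python
def mmlu_config(task: str, n_shot: int = 5) -> str:
    """Build the config name for an MMLU sub-task, e.g. 'anatomy'."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

# These match the config_name entries in the YAML header:
print(mmlu_config("anatomy"))           # harness_hendrycksTest_anatomy_5
print(mmlu_config("abstract_algebra"))  # harness_hendrycksTest_abstract_algebra_5
```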
## Latest results
These are the [latest results from run 2024-01-27T14:39:43.226327](https://huggingface.co/datasets/open-llm-leaderboard/details_kaitchup__Mayonnaise-4in1-02/blob/main/results_2024-01-27T14-39-43.226327.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6551916779138574,
"acc_stderr": 0.03200844050733691,
"acc_norm": 0.6543582791974909,
"acc_norm_stderr": 0.03267980180170166,
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107475,
"mc2": 0.6904124035444142,
"mc2_stderr": 0.015168084933661277
},
"harness|arc:challenge|25": {
"acc": 0.7056313993174061,
"acc_stderr": 0.01331852846053942,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523193
},
"harness|hellaswag|10": {
"acc": 0.7186815375423222,
"acc_stderr": 0.0044872356579556735,
"acc_norm": 0.8850826528579964,
"acc_norm_stderr": 0.00318270383035113
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.02415222596280158,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.02415222596280158
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.016574027219517635,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.016574027219517635
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107475,
"mc2": 0.6904124035444142,
"mc2_stderr": 0.015168084933661277
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873509
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.01249392734865963
}
}
```
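The per-task entries above can be aggregated in the usual way, e.g. a macro-average over all `hendrycksTest` (MMLU) sub-tasks. A minimal sketch, shown here with a small illustrative subset of the scores from the results above:

```python
# Illustrative subset of the per-task results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6973684210526315},
}

# Macro-average: each MMLU sub-task contributes equally, regardless
# of how many questions it contains.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU (macro avg over {len(mmlu_scores)} tasks): {mmlu_avg:.4f}")
```

Running the same aggregation over all 57 sub-tasks reproduces the kind of summary score reported on the leaderboard.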
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maddi99/en_c_bn_qa | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2388914407.0404744
num_examples: 206108
- name: test
num_bytes: 265436221.95952561
num_examples: 22901
download_size: 1015702371
dataset_size: 2654350629.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
burtenshaw/DIBT_prompts_ranked_synthetic_mistral_os_8x7b | ---
dataset_info:
features:
- name: input
dtype: string
- name: quality
list:
- name: status
dtype: string
- name: user_id
dtype: string
- name: value
dtype: string
- name: metadata
dtype: string
- name: avg_rating
dtype: float64
- name: num_responses
dtype: int64
- name: agreement_ratio
dtype: float64
- name: raw_responses
sequence: int64
- name: kind
dtype: string
- name: cluster_description
dtype: string
- name: topic
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: generations
dtype: 'null'
- name: rating
sequence: float64
- name: rationale
sequence: string
splits:
- name: train
num_bytes: 18209614
num_examples: 10331
download_size: 6168967
dataset_size: 18209614
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RoshanVelpula/ingredients_to_recipe_llama2_format | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1902688564
num_examples: 2231142
download_size: 905898488
dataset_size: 1902688564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
**Ingredients to Recipe Dataset**
- Thanks to PoojaBhati/ingredients-recipe
- This is a formatted version of the above dataset for fine-tuning Llama 2
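The card does not show a sample row, so as a hedged illustration only, the Llama 2 instruction format that such conversions typically target wraps each pair in `<s>[INST] ... [/INST] ... </s>` markers. The function and field names below are assumptions for the sketch, not taken from this dataset:

```python
# Hypothetical sketch: turning an (ingredients, recipe) pair into a single
# "text" field in the Llama 2 instruction format. The prompt wording and
# field names are illustrative, not taken from this dataset.

def to_llama2_text(ingredients: str, recipe: str) -> str:
    return f"<s>[INST] Write a recipe using: {ingredients} [/INST] {recipe} </s>"

example = to_llama2_text("eggs, flour, milk", "Whisk everything together and fry as pancakes.")
print(example)
```

A fine-tuning pipeline would then train on the resulting `text` column directly, with the `[INST]`/`[/INST]` markers delimiting the instruction from the response.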
|
datasets-examples/doc-yaml-4 | ---
configs:
- config_name: main_data
data_files: "main_data.csv"
- config_name: additional_data
data_files: "additional_data.csv"
size_categories:
- n<1K
---
# [doc] manual configuration 4
This dataset contains two CSV files at the root, and a YAML field `configs` that maps each config name to its data files.
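Concretely, a loader resolves a config name to its data files via that YAML list; this is what the `datasets` library does when a config name is passed to `load_dataset` (e.g. `load_dataset("datasets-examples/doc-yaml-4", "main_data")`). A minimal self-contained sketch of the lookup:

```python
# Minimal sketch of the name -> data_files resolution implied by the
# `configs` YAML field above (the list is reproduced here as Python dicts).
configs = [
    {"config_name": "main_data", "data_files": "main_data.csv"},
    {"config_name": "additional_data", "data_files": "additional_data.csv"},
]

def data_files_for(name: str) -> str:
    """Return the data_files entry for the config with the given name."""
    for cfg in configs:
        if cfg["config_name"] == name:
            return cfg["data_files"]
    raise KeyError(f"unknown config: {name}")

print(data_files_for("main_data"))  # main_data.csv
```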
|
open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MOE-200K | ---
pretty_name: Evaluation run of cloudyu/Yi-34Bx2-MOE-200K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cloudyu/Yi-34Bx2-MOE-200K](https://huggingface.co/cloudyu/Yi-34Bx2-MOE-200K)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MOE-200K\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-25T08:51:26.934608](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MOE-200K/blob/main/results_2024-03-25T08-51-26.934608.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7632100588021545,\n\
\ \"acc_stderr\": 0.02824761437325172,\n \"acc_norm\": 0.7667443783251159,\n\
\ \"acc_norm_stderr\": 0.0287895932444278,\n \"mc1\": 0.5042839657282742,\n\
\ \"mc1_stderr\": 0.01750285857737126,\n \"mc2\": 0.6818760711458495,\n\
\ \"mc2_stderr\": 0.014303684430177103\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441374,\n\
\ \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.01332975029338232\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6492730531766581,\n\
\ \"acc_stderr\": 0.004762223492435249,\n \"acc_norm\": 0.8463453495319657,\n\
\ \"acc_norm_stderr\": 0.003598803855460636\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n\
\ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n\
\ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474945,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474945\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775402,\n\
\ \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775402\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.02628055093284806,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.02628055093284806\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.62,\n \"acc_stderr\": 0.048783173121456344,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456344\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7687861271676301,\n\
\ \"acc_stderr\": 0.032147373020294696,\n \"acc_norm\": 0.7687861271676301,\n\
\ \"acc_norm_stderr\": 0.032147373020294696\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387533,\n\
\ \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387533\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.037245636197746304,\n\
\ \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.037245636197746304\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7380952380952381,\n \"acc_stderr\": 0.022644212615525208,\n \"\
acc_norm\": 0.7380952380952381,\n \"acc_norm_stderr\": 0.022644212615525208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n\
\ \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n\
\ \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9064516129032258,\n\
\ \"acc_stderr\": 0.01656575466827098,\n \"acc_norm\": 0.9064516129032258,\n\
\ \"acc_norm_stderr\": 0.01656575466827098\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n\
\ \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199488,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8051282051282052,\n \"acc_stderr\": 0.020083167595181393,\n\
\ \"acc_norm\": 0.8051282051282052,\n \"acc_norm_stderr\": 0.020083167595181393\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45555555555555555,\n \"acc_stderr\": 0.03036486250482443,\n \
\ \"acc_norm\": 0.45555555555555555,\n \"acc_norm_stderr\": 0.03036486250482443\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.02404405494044049,\n \
\ \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.02404405494044049\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5231788079470199,\n \"acc_stderr\": 0.04078093859163086,\n \"\
acc_norm\": 0.5231788079470199,\n \"acc_norm_stderr\": 0.04078093859163086\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334884,\n \"\
acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334884\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6527777777777778,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9113924050632911,\n \"acc_stderr\": 0.01849831520686538,\n \
\ \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.01849831520686538\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342323,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342323\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\
\ \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n\
\ \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640406,\n\
\ \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640406\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9054916985951469,\n\
\ \"acc_stderr\": 0.010461015338193071,\n \"acc_norm\": 0.9054916985951469,\n\
\ \"acc_norm_stderr\": 0.010461015338193071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.02038322955113502,\n\
\ \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.02038322955113502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8134078212290503,\n\
\ \"acc_stderr\": 0.013029631416358357,\n \"acc_norm\": 0.8134078212290503,\n\
\ \"acc_norm_stderr\": 0.013029631416358357\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.020464175124332625,\n\
\ \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.020464175124332625\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571842,\n\
\ \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614095,\n \
\ \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5775749674054759,\n\
\ \"acc_stderr\": 0.012615600475734927,\n \"acc_norm\": 0.5775749674054759,\n\
\ \"acc_norm_stderr\": 0.012615600475734927\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8161764705882353,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.8161764705882353,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.02292300409473685,\n\
\ \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.02292300409473685\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5042839657282742,\n\
\ \"mc1_stderr\": 0.01750285857737126,\n \"mc2\": 0.6818760711458495,\n\
\ \"mc2_stderr\": 0.014303684430177103\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.01062696452997186\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7270659590598939,\n \
\ \"acc_stderr\": 0.012270381151108754\n }\n}\n```"
repo_url: https://huggingface.co/cloudyu/Yi-34Bx2-MOE-200K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|arc:challenge|25_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|gsm8k|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hellaswag|10_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T08-51-26.934608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T08-51-26.934608.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- '**/details_harness|winogrande|5_2024-03-25T08-51-26.934608.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-25T08-51-26.934608.parquet'
- config_name: results
data_files:
- split: 2024_03_25T08_51_26.934608
path:
- results_2024-03-25T08-51-26.934608.parquet
- split: latest
path:
- results_2024-03-25T08-51-26.934608.parquet
---
# Dataset Card for Evaluation run of cloudyu/Yi-34Bx2-MOE-200K
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Yi-34Bx2-MOE-200K](https://huggingface.co/cloudyu/Yi-34Bx2-MOE-200K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MOE-200K",
	"harness_winogrande_5",
	split="latest")  # the configs define timestamped splits and "latest", not "train"
```
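If you need a specific historical run rather than the latest one, the split name can be derived from the run timestamp. A minimal helper (a sketch, assuming the naming scheme visible in the YAML configs above, where `-` and `:` in the timestamp become `_`):

```python
def run_split_name(timestamp: str) -> str:
    """Convert a run timestamp like '2024-03-25T08:51:26.934608' into the
    corresponding split name ('2024_03_25T08_51_26.934608')."""
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2024-03-25T08:51:26.934608"))
# → 2024_03_25T08_51_26.934608
```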
## Latest results
These are the [latest results from run 2024-03-25T08:51:26.934608](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MOE-200K/blob/main/results_2024-03-25T08-51-26.934608.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.7632100588021545,
"acc_stderr": 0.02824761437325172,
"acc_norm": 0.7667443783251159,
"acc_norm_stderr": 0.0287895932444278,
"mc1": 0.5042839657282742,
"mc1_stderr": 0.01750285857737126,
"mc2": 0.6818760711458495,
"mc2_stderr": 0.014303684430177103
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441374,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.01332975029338232
},
"harness|hellaswag|10": {
"acc": 0.6492730531766581,
"acc_stderr": 0.004762223492435249,
"acc_norm": 0.8463453495319657,
"acc_norm_stderr": 0.003598803855460636
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474945,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474945
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775402,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775402
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02628055093284806,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02628055093284806
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456344,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456344
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.032147373020294696,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.032147373020294696
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.037245636197746304,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.037245636197746304
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7380952380952381,
"acc_stderr": 0.022644212615525208,
"acc_norm": 0.7380952380952381,
"acc_norm_stderr": 0.022644212615525208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9064516129032258,
"acc_stderr": 0.01656575466827098,
"acc_norm": 0.9064516129032258,
"acc_norm_stderr": 0.01656575466827098
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199488,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8051282051282052,
"acc_stderr": 0.020083167595181393,
"acc_norm": 0.8051282051282052,
"acc_norm_stderr": 0.020083167595181393
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45555555555555555,
"acc_stderr": 0.03036486250482443,
"acc_norm": 0.45555555555555555,
"acc_norm_stderr": 0.03036486250482443
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.02404405494044049,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.02404405494044049
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5231788079470199,
"acc_stderr": 0.04078093859163086,
"acc_norm": 0.5231788079470199,
"acc_norm_stderr": 0.04078093859163086
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334884,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334884
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.01849831520686538,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.01849831520686538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342323,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342323
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640406,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.010461015338193071,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.010461015338193071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.02038322955113502,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.02038322955113502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8134078212290503,
"acc_stderr": 0.013029631416358357,
"acc_norm": 0.8134078212290503,
"acc_norm_stderr": 0.013029631416358357
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.020464175124332625,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.020464175124332625
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571842,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.028947338851614095,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.028947338851614095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5775749674054759,
"acc_stderr": 0.012615600475734927,
"acc_norm": 0.5775749674054759,
"acc_norm_stderr": 0.012615600475734927
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8161764705882353,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.8161764705882353,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.02292300409473685,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.02292300409473685
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5042839657282742,
"mc1_stderr": 0.01750285857737126,
"mc2": 0.6818760711458495,
"mc2_stderr": 0.014303684430177103
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.01062696452997186
},
"harness|gsm8k|5": {
"acc": 0.7270659590598939,
"acc_stderr": 0.012270381151108754
}
}
```
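One common way to post-process the results JSON above is to average the per-task MMLU accuracies (the `harness|hendrycksTest-*` entries). A minimal sketch, using a small excerpt of the values shown above as the `results` dict:

```python
# Excerpt of the results payload shown above (three MMLU subtasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.49},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.7481481481481481},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.881578947368421},
}

# Collect the accuracy of every hendrycksTest (MMLU) subtask and average.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))
# → 0.7066
```

Run against the full payload, this averages all 57 MMLU subtasks rather than the three excerpted here.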
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maximuslee07/raqna500 | ---
license: llama2
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 436481
num_examples: 500
download_size: 229757
dataset_size: 436481
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kfahn/dog_images_demo | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_labeled
dtype: image
splits:
- name: train
num_bytes: 11522296.0
num_examples: 10
download_size: 11519890
dataset_size: 11522296.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jvdgoltz/dbnl.org-dutch-public-domain | ---
dataset_info:
features:
- name: meta
struct:
- name: 'Unnamed: 28'
dtype: string
- name: _jaar
dtype: int64
- name: achternaam
dtype: string
- name: bibliotheek
dtype: string
- name: categorie
dtype: int64
- name: chapter
dtype: int64
- name: druk
dtype: string
- name: edition
dtype: string
- name: geb_datum
dtype: string
- name: geb_land_code
dtype: string
- name: geb_plaats
dtype: string
- name: geb_plaats_code
dtype: string
- name: genre
dtype: string
- name: jaar
dtype: string
- name: jaar_geboren
dtype: string
- name: jaar_overlijden
dtype: string
- name: language
dtype: string
- name: maand
dtype: string
- name: overl_datum
dtype: string
- name: overl_land_code
dtype: string
- name: overl_plaats
dtype: string
- name: overl_plaats_code
dtype: string
- name: pers_id
dtype: string
- name: ppn_o
dtype: string
- name: revision_date
dtype: string
- name: section
dtype: int64
- name: text_url
dtype: string
- name: ti_id
dtype: string
- name: titel
dtype: string
- name: url
dtype: string
- name: vols
dtype: string
- name: voornaam
dtype: string
- name: voorvoegsel
dtype: string
- name: vrouw
dtype: int64
- name: text
dtype: string
- name: id
dtype: string
configs:
- config_name: default
data_files:
- split: train
path: train.parquet
- split: validation
path: validation.parquet
task_categories:
- text-generation
- fill-mask
language:
- nl
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
license:
- cc0-1.0
---
# Dataset Card for "dbnl.org-dutch-public-domain"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [DBNL Public Domain Collection](https://www.dbnl.org/letterkunde/pd/index.php)
- **Point of Contact:** julian at vdgoltz.net
### Dataset Summary
This dataset comprises a collection of Dutch literary texts in the public domain, drawn from the DBNL (Digitale Bibliotheek voor de Nederlandse Letteren) public domain collection. The collection includes books, poems, songs, letters, and other documents that are at least 140 years old and thus free of copyright restrictions. Each entry in the dataset corresponds to one section of a chapter of a text, ensuring a granular level of detail for text analysis.
### Supported Tasks and Leaderboards
- Language Modeling
- Text Generation
- Other tasks that can benefit from historical Dutch texts
### Languages
The dataset is primarily in Dutch (nl).
## Dataset Structure
### Data Instances
A data instance corresponds to a section of a chapter of a document, including metadata such as title, author, publication year, and the text content itself.
### Data Fields
- `ti_id`: Unique text identifier
- `titel`: Title of the text
- `jaar`: Publication year
- `druk`: Edition
- `bibliotheek`: Library code
- `categorie`: Category ID
- `pers_id`: Person ID
- `voornaam`: Author's first name
- `achternaam`: Author's last name
- `url`: URL to the text
- `text_url`: URL to the text in .txt format
- `revision_date`: Date of the revision
- `edition`: Edition details
- `language`: Language of the text
- `chapter`: Chapter number
- `section`: Section number
### Data Splits
The dataset is split into training and validation sets at text level (90:10), ensuring that sections or chapters from the same document do not leak from one split to another.
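A document-level split like this can be reproduced with a short sketch; the grouping key `ti_id` matches the text identifier field listed above, while the toy records, the seed, and the exact shuffling are illustrative assumptions:

```python
import random

def split_by_document(records, train_frac=0.9, seed=42):
    """Split at the text level: sections of the same document
    (same ti_id) never end up in different splits."""
    doc_ids = sorted({r["ti_id"] for r in records})
    rng = random.Random(seed)
    rng.shuffle(doc_ids)
    cut = int(len(doc_ids) * train_frac)
    train_docs = set(doc_ids[:cut])
    train = [r for r in records if r["ti_id"] in train_docs]
    valid = [r for r in records if r["ti_id"] not in train_docs]
    return train, valid

# Toy records: three documents with four sections each.
records = [{"ti_id": f"doc{d}", "section": s} for d in range(3) for s in range(4)]
train, valid = split_by_document(records, train_frac=2 / 3)
assert {r["ti_id"] for r in train}.isdisjoint({r["ti_id"] for r in valid})
```

Because whole documents are assigned to one side before the sections are collected, no chapter or section can leak across the boundary.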
## Dataset Creation
### Curation Rationale
The dataset was curated to make historical Dutch texts available for computational analysis, preserving cultural heritage and supporting research in the humanities and linguistic studies.
### Source Data
#### Initial Data Collection and Normalization
Data was collected from the DBNL's public domain collection, normalized, and structured to facilitate computational use.
#### Who are the source language producers?
The source language producers are authors of Dutch literature whose works have entered the public domain, meaning they died at least 70 years ago.
### Annotations
The dataset does not contain annotations.
### Personal and Sensitive Information
Given the historical nature of the texts, they are free from personal and sensitive information concerns in the contemporary sense. However, they reflect the social norms, biases, and cultural contexts of their time.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset serves as a valuable resource for understanding Dutch literary heritage, cultural history, and language evolution over time. It can support diverse research agendas in computational linguistics, cultural studies, and history.
### Discussion of Biases
The texts contain biases prevalent at their time of publication, including colonialism, racism, sexism, and other societal norms of their era. Users are urged to consider these contexts critically and use the data responsibly.
### Other Known Limitations
The dataset's historical nature means it may not be suitable for applications requiring contemporary language use or norms.
## Additional Information
### Dataset Curators
This dataset was curated by https://huggingface.co/jvdgoltz, who is not affiliated with DBNL.org and did not act on their behalf. The data is sourced from the DBNL public domain collection.
### Licensing Information
The texts in this dataset are in the public domain. According to ChatGPT-4, the best-fitting license is Creative Commons Zero v1.0 Universal, making them legally available for use, redistribution, and adaptation by anyone for any purpose.
### Citation Information
Not applicable.
|
sminpark/ds-alpha-small-dataset | ---
license: gpl
---
|
jorgeortizfuentes/spanish_attitude | ---
dataset_info:
features:
- name: text
dtype: string
- name: tokens
sequence: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
list:
- name: end
dtype: int64
- name: label
dtype: string
- name: start
dtype: int64
- name: annotation_agent
dtype: string
- name: vectors
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: annotated
struct:
- name: mentions
list:
- name: capitalness
dtype: string
- name: chars_length
dtype: int64
- name: density
dtype: float64
- name: label
dtype: string
- name: score
dtype: float64
- name: tokens_length
dtype: int64
- name: value
dtype: string
- name: tags
list:
- name: tag
dtype: string
- name: value
dtype: string
- name: predicted
struct:
- name: mentions
sequence: 'null'
- name: tags
sequence: 'null'
- name: text_length
dtype: int64
- name: tokens
list:
- name: capitalness
dtype: string
- name: char_end
dtype: int64
- name: char_start
dtype: int64
- name: custom
dtype: 'null'
- name: idx
dtype: int64
- name: length
dtype: int64
- name: score
dtype: 'null'
- name: tag
dtype: string
- name: value
dtype: string
- name: tokens_length
dtype: int64
splits:
- name: train
num_bytes: 3791404
num_examples: 801
download_size: 956149
dataset_size: 3791404
---
# Dataset Card for "spanish_attitude"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nykiz/pixel-images | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 45378.0
num_examples: 27
download_size: 55401
dataset_size: 45378.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LLM-Tuning-Safety/HEx-PHI | ---
license: other
license_name: hex-phi
license_link: https://huggingface.co/datasets/LLM-Tuning-Safety/HEx-PHI/#hex-phi-dataset-license-agreement
extra_gated_prompt: You agree to the [HEx-PHI Dataset License Agreement](https://huggingface.co/datasets/LLM-Tuning-Safety/HEx-PHI/#hex-phi-dataset-license-agreement). Also, please specify the following fields in detail (we suggest you fill in your affiliation email), based on which we will inspect and manually grant access to approved users. If you have not been granted access, please email us (see email contact from our paper) and specify more details.
extra_gated_fields:
Name: text
Email: text
Affiliation: text
Country: text
Purpose: text
configs:
- config_name: default
data_files:
- split: Category_1_Illegal_Activity
path: category_1.csv
- split: Category_2_Child_Abuse_Content
path: category_2.csv
- split: Category_3_Hate_Harass_Violence
path: category_3.csv
- split: Category_4_Malware
path: category_4.csv
- split: Category_5_Physical_Harm
path: category_5.csv
- split: Category_6_Economic_Harm
path: category_6.csv
- split: Category_7_Fraud_Deception
path: category_7.csv
- split: Category_8_Adult_Content
path: category_8.csv
- split: Category_9_Political_Campaigning
path: category_9.csv
- split: Category_10_Privacy_Violation_Activity
path: category_10.csv
- split: Category_11_Tailored_Financial_Advice
path: category_11.csv
task_categories:
- text-generation
- conversational
language:
- en
pretty_name: Human-Extended Policy-Oriented Harmful Instruction Benchmark
size_categories:
- n<1K
tags:
- harmfulness
- benchmark
---
# HEx-PHI: **H**uman-**Ex**tended **P**olicy-Oriented **H**armful **I**nstruction Benchmark
This dataset contains 330 harmful instructions (30 examples x 11 prohibited categories) for LLM harmfulness evaluation.
In our work "[Fine-tuning Aligned Language Models Compromises Safety, Even When Users Do Not Intend To!](https://arxiv.org/pdf/2310.03693.pdf)", to comprehensively cover as many harmfulness categories as possible,
we develop this new safety evaluation benchmark directly based on the exhaustive lists of prohibited use cases found in **Meta**’s Llama-2 usage policy and **OpenAI**’s usage policy.
Specifically, we gather 30 examples of harmful instructions for each of the 11 prohibited use case categories from various sources, including:
* Anthropic [Red Teaming Data](https://github.com/anthropics/hh-rlhf/tree/master/red-team-attempts)
* *AdvBench* from "[Universal and Transferable Adversarial Attacks on Aligned Language Models](https://arxiv.org/abs/2307.15043)"
* The harmful requests are further categorized & refined by human annotators, GPT-4, Claude, jailbroken GPT-3.5, jailbroken Llama2, etc.
For more details, please refer to our project website: [http://llm-tuning-safety.github.io](http://llm-tuning-safety.github.io).
## Dataset Format
The 11 categories of harmful instructions are stored separately in 11 independent files in plain CSV format:
* [category_1.csv](category_1.csv): Illegal Activity
* [category_2.csv](category_2.csv): Child Abuse Content
* [category_3.csv](category_3.csv): Hate / Harass / Violence
* [category_4.csv](category_4.csv): Malware
* [category_5.csv](category_5.csv): Physical Harm
* [category_6.csv](category_6.csv): Economic Harm
* [category_7.csv](category_7.csv): Fraud Deception
* [category_8.csv](category_8.csv): Adult Content
* [category_9.csv](category_9.csv): Political Campaigning
* [category_10.csv](category_10.csv): Privacy Violation Activity
* [category_11.csv](category_11.csv): Tailored Financial Advice
## Dataset Usage
With our 11 harmful categories, we hope HEx-PHI can help researchers comprehensively evaluate fine-grained safety risks and harmfulness underlying LLMs.
Practically, one should use HEx-PHI harmful instructions as (part of) input prompts, and *inspect*👀 whether the responses generated by the LLM satisfy the harmful intentions of the instructions.
👀In our [paper](https://arxiv.org/pdf/2310.03693.pdf), we use GPT-4 as the harmfulness judge to assign each `<user instruction, model response>` pair a harmfulness score from 1 to 5. Refer to Appendix B for details.
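A minimal evaluation loop over one category file might look like the sketch below; `generate` stands in for the model under test and `judge` for a real GPT-4 judging call, both of which are placeholder stubs here, and the CSV layout (one instruction per row) is an assumption:

```python
import csv
import io

def evaluate(csv_text, generate, judge):
    """Average harmfulness score (1-5) over <instruction, response> pairs."""
    scores = []
    for row in csv.reader(io.StringIO(csv_text)):
        instruction = row[0]
        response = generate(instruction)             # model under test
        scores.append(judge(instruction, response))  # 1 (safe) .. 5 (harmful)
    return sum(scores) / len(scores)

# Stub model and judge, for illustration only.
refuse = lambda instruction: "I can't help with that."
stub_judge = lambda instruction, response: 1 if "can't" in response else 5
sample_csv = "placeholder instruction A\nplaceholder instruction B\n"
print(evaluate(sample_csv, refuse, stub_judge))  # 1.0: the stub always refuses
```

In practice the judge would be prompted with the rubric from Appendix B of the paper and the per-category averages reported separately.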
## HEx-PHI Dataset License Agreement
This Agreement contains the terms and conditions that govern your access and use of the HEx-PHI Dataset (as defined above). You may not use the HEx-PHI Dataset if you do not accept this Agreement. By clicking to accept, accessing the HEx-PHI Dataset, or both, you hereby agree to the terms of the Agreement. If you are agreeing to be bound by the Agreement on behalf of your employer or another entity, you represent and warrant that you have full legal authority to bind your employer or such entity to this Agreement. If you do not have the requisite authority, you may not accept the Agreement or access the HEx-PHI Dataset on behalf of your employer or another entity.
* Safety and Moderation: **This dataset contains unsafe conversations or prompts that may be perceived as offensive or unsettling.** Users may not use this dataset for training machine learning models for any harmful purpose. The dataset may not be used to generate content in violation of any law. These prompts should not be used as inputs to models that can generate modalities outside of text (including, but not limited to, images, audio, video, or 3D models)
* Non-Endorsement: The views and opinions depicted in this dataset **do not reflect** the perspectives of the researchers or affiliated institutions engaged in the data collection process.
* Legal Compliance: You are mandated to use it in adherence with all pertinent laws and regulations.
* Model Specific Terms: When leveraging direct outputs of a specific model, users must adhere to its **corresponding terms of use and relevant legal standards**.
* Non-Identification: You **must not** attempt to identify the identities of individuals or infer any sensitive personal data encompassed in this dataset.
* Prohibited Transfers: You **should not** distribute, copy, disclose, assign, sublicense, embed, host, or otherwise transfer the dataset to any third party.
* Right to Request Deletion: At any time, we may require you to delete all copies of this instruction dataset (in whole or in part) in your possession and control. You will promptly comply with any and all such requests. Upon our request, you shall provide us with written confirmation of your compliance with such requirement.
* Termination: We may, at any time, for any reason or for no reason, terminate this Agreement, effective immediately upon notice to you. Upon termination, the license granted to you hereunder will immediately terminate, and you will immediately stop using the HEx-PHI Dataset and destroy all copies of the HEx-PHI Dataset and related materials in your possession or control.
* Limitation of Liability: IN NO EVENT WILL WE BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL, EXEMPLARY, PUNITIVE, SPECIAL, OR INDIRECT DAMAGES (INCLUDING DAMAGES FOR LOSS OF PROFITS, BUSINESS INTERRUPTION, OR LOSS OF INFORMATION) ARISING OUT OF OR RELATING TO THIS AGREEMENT OR ITS SUBJECT MATTER, EVEN IF WE HAVE BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
Subject to your compliance with the terms and conditions of this Agreement, we grant to you, a limited, non-exclusive, non-transferable, non-sublicensable license to use the HEx-PHI Dataset, including the conversation data and annotations, to research, and evaluate software, algorithms, machine learning models, techniques, and technologies for both research and commercial purposes.
## Citation
```
@inproceedings{
anonymous2024finetuning,
title={Fine-tuning Aligned Language Models Compromises Safety, Even When Users Do Not Intend To!},
author={Xiangyu Qi and Yi Zeng and Tinghao Xie and Pin-Yu Chen and Ruoxi Jia and Prateek Mittal and Peter Henderson},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=hTEGyKf0dZ}
}
``` |
roy1109/mygpt | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1054429444
num_examples: 395000
download_size: 234273956
dataset_size: 1054429444
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
judy93536/benz-peri-52k-torchdata | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 60873521.0
num_examples: 26079
- name: test
num_bytes: 60873521.0
num_examples: 26079
download_size: 70228333
dataset_size: 121747042.0
---
# Dataset Card for "benz-peri-52k-torchdata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_104 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1478467616.0
num_examples: 288088
download_size: 1509920546
dataset_size: 1478467616.0
---
# Dataset Card for "chunk_104"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_nq_train600_eval300_v1_reciteonly_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 68720
num_examples: 600
- name: train_recite_qa
num_bytes: 453011
num_examples: 600
- name: eval_qa
num_bytes: 35277
num_examples: 300
- name: eval_recite_qa
num_bytes: 226920
num_examples: 300
- name: all_docs
num_bytes: 574063
num_examples: 883
- name: all_docs_eval
num_bytes: 573998
num_examples: 883
- name: train
num_bytes: 453011
num_examples: 600
- name: validation
num_bytes: 226920
num_examples: 300
download_size: 1649745
dataset_size: 2611920
---
# Dataset Card for "lmind_nq_train600_eval300_v1_reciteonly_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EthioNLP/Amharic_LLAMA_MT | ---
language:
- am
- en
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prompt_header
dtype: string
- name: datasource
dtype: string
splits:
- name: train
num_bytes: 84855653
num_examples: 200000
- name: validation
num_bytes: 1209980
num_examples: 1994
- name: test
num_bytes: 1306100
num_examples: 2024
download_size: 23384531
dataset_size: 87371733
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
HK83/Anime_Faces | ---
license: afl-3.0
---
|
liuyanchen1015/MULTI_VALUE_cola_conditional_were_was | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 187
num_examples: 2
- name: train
num_bytes: 465
num_examples: 5
download_size: 4368
dataset_size: 652
---
# Dataset Card for "MULTI_VALUE_cola_conditional_were_was"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
QuangDuy/datallms | ---
dataset_info:
features:
- name: Context
dtype: string
- name: Statement
dtype: string
- name: labels
dtype: string
- name: text
dtype: string
splits:
- name: Training
num_bytes: 42144490
num_examples: 5062
- name: Development
num_bytes: 5951142
num_examples: 723
- name: Test
num_bytes: 12078566
num_examples: 1447
download_size: 24842842
dataset_size: 60174198
configs:
- config_name: default
data_files:
- split: Training
path: data/Training-*
- split: Development
path: data/Development-*
- split: Test
path: data/Test-*
---
|
NobodyExistsOnTheInternet/sub4096ctx | ---
license: mit
---
|
autopilot-ai/correct-incorrect-spelling-pairs | ---
license: apache-2.0
task_categories:
- text-classification
- text2text-generation
language:
- gu
pretty_name: spelling pairs
size_categories:
- 100K<n<1M
---
This is a dataset containing correct and incorrect spelling pairs in Gujarati, created by us using artificial noise. |
AyushNayak/149e1671-82ca-438b-bad8-cd41789b5f41 | ---
dataset_info:
features:
- name: input
dtype: int64
- name: instruction
dtype: int64
- name: output
dtype: int64
splits:
- name: train
num_bytes: 96
num_examples: 4
download_size: 1915
dataset_size: 96
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lilyyellow/func_calling | ---
dataset_info:
features:
- name: system_prompt
dtype: string
- name: instruction
dtype: string
- name: output
struct:
- name: content
dtype: string
- name: end_index
dtype: string
- name: start_index
dtype: string
- name: task
dtype: string
- name: task
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8738565
num_examples: 3240
- name: test
num_bytes: 957111
num_examples: 360
download_size: 1754700
dataset_size: 9695676
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
gsstein/50-percent-human-dataset-opt-og | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: generated
dtype: bool
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 85997897
num_examples: 15326
- name: test
num_bytes: 3054170
num_examples: 576
- name: validation
num_bytes: 3250912
num_examples: 576
download_size: 57094399
dataset_size: 92302979
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
another-symato/VMTEB-vietnamese_students_feedback_sentiment | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1023776
num_examples: 11426
- name: validation
num_bytes: 136041
num_examples: 1583
- name: test
num_bytes: 282675
num_examples: 3166
download_size: 660439
dataset_size: 1442492
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
OdiaGenAI/gpt-teacher-roleplay-odia-3k | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- or
pretty_name: GPT-Teacher-RolePlay-Odia-3K
size_categories:
- 1K<n<10K
---
# Dataset Card for GPT-Teacher-RolePlay-Odia-3K
## Dataset Description
- **Homepage: https://www.odiagenai.org/**
- **Repository: https://github.com/shantipriyap/OdiaGenAI**
- **Point of Contact: Shantipriya Parida, and Sambit Sekhar**
### Dataset Summary
This dataset is the Odia-translated version of the GPT-Teacher-RolePlay 3K instruction set. Both the English and the Odia instruction, input, and output strings are available.
### Supported Tasks and Leaderboards
Large Language Model (LLM)
### Languages
Odia
## Dataset Structure
JSON
### Data Fields
- instruction (string)
- english_instruction (string)
- input (string)
- english_input (string)
- output (string)
- english_output (string)
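Putting those fields together, a single record pairs each Odia string with its English counterpart; the example below is a hypothetical placeholder to show the layout, not actual dataset content:

```python
# Hypothetical record showing the field layout only; all values are placeholders.
record = {
    "instruction": "<Odia instruction>",
    "english_instruction": "Act as a history teacher and describe the event.",
    "input": "<Odia input>",
    "english_input": "",
    "output": "<Odia output>",
    "english_output": "The event took place because ...",
}

expected_fields = {"instruction", "english_instruction", "input",
                   "english_input", "output", "english_output"}
assert set(record) == expected_fields
```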
### Licensing Information
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg
### Citation Information
If you find this repository useful, please consider giving 👏 and citing:
```
@misc{OdiaGenAI,
author = {Shantipriya Parida and Sambit Sekhar and Subhadarshi Panda and Soumendra Kumar Sahoo and Swateek Jena and Abhijeet Parida and Arghyadeep Sen and Satya Ranjan Dash and Deepak Kumar Pradhan},
title = {OdiaGenAI: Generative AI and LLM Initiative for the Odia Language},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/OdiaGenAI}},
}
```
### Contributions
- Shantipriya Parida
- Sambit Sekhar |
ctoraman/large-scale-hate-speech-v1 | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
language:
- en
tags:
- hate speech
- hate speech detection
- hate-speech
- tweets
- social media
- topic
- hate-speech-detection
---
This is the dataset published in the LREC 2022 paper "Large-Scale Hate Speech Detection with Cross-Domain Transfer".
# This is Dataset v1:
This is the original dataset, which includes 100,000 tweets in English. Only annotations with more than 60% agreement are included.

- TweetID: Tweet ID from the Twitter API
- LangID: 1 (English)
- TopicID: Topic domain (0: Religion, 1: Gender, 2: Race, 3: Politics, 4: Sports)
- HateLabel: Final hate label decision (0: Normal, 1: Offensive, 2: Hate)
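Assuming the column layout above, the numeric codes decode as follows (the example row is invented for illustration):

```python
TOPICS = {0: "Religion", 1: "Gender", 2: "Race", 3: "Politics", 4: "Sports"}
HATE_LABELS = {0: "Normal", 1: "Offensive", 2: "Hate"}

def decode(row):
    """Map the numeric TopicID / HateLabel codes to readable names."""
    return {
        "TweetID": row["TweetID"],
        "Topic": TOPICS[row["TopicID"]],
        "Label": HATE_LABELS[row["HateLabel"]],
    }

example = {"TweetID": "123", "LangID": 1, "TopicID": 3, "HateLabel": 2}
print(decode(example))  # {'TweetID': '123', 'Topic': 'Politics', 'Label': 'Hate'}
```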
# GitHub Repo:
https://github.com/avaapm/hatespeech
# Citation:
Toraman, C., Şahinuç, F., & Yilmaz, E. (2022, June). Large-Scale Hate Speech Detection with Cross-Domain Transfer. In Proceedings of the Thirteenth Language Resources and Evaluation Conference (pp. 2215-2225). |
audreylorb/filtered_qrels_df | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: query_id
dtype: int64
- name: doc_id
dtype: int64
- name: relevance
dtype: int64
- name: doc_text
dtype: string
- name: query_text
dtype: string
splits:
- name: train
num_bytes: 341011
num_examples: 928
download_size: 159700
dataset_size: 341011
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_jsfs11__HighdensityRPMerge-7B | ---
pretty_name: Evaluation run of jsfs11/HighdensityRPMerge-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jsfs11/HighdensityRPMerge-7B](https://huggingface.co/jsfs11/HighdensityRPMerge-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__HighdensityRPMerge-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-04T13:31:31.146894](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__HighdensityRPMerge-7B/blob/main/results_2024-03-04T13-31-31.146894.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6497483676912047,\n\
\ \"acc_stderr\": 0.03208497148424546,\n \"acc_norm\": 0.6515040784336489,\n\
\ \"acc_norm_stderr\": 0.032727124352621983,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.01732308859731476,\n \"mc2\": 0.6044359173072787,\n\
\ \"mc2_stderr\": 0.015333928478080957\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.013975454122756553,\n\
\ \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693249\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6756622186815375,\n\
\ \"acc_stderr\": 0.004671701705567244,\n \"acc_norm\": 0.8657637920732921,\n\
\ \"acc_norm_stderr\": 0.0034020920763237414\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337145,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337145\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.02354079935872329,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.02354079935872329\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970572,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970572\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978093,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978093\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n\
\ \"acc_stderr\": 0.01642167050633917,\n \"acc_norm\": 0.40558659217877097,\n\
\ \"acc_norm_stderr\": 0.01642167050633917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862356,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862356\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.01732308859731476,\n \"mc2\": 0.6044359173072787,\n\
\ \"mc2_stderr\": 0.015333928478080957\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7940015785319653,\n \"acc_stderr\": 0.011366474352008825\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.620166793025019,\n \
\ \"acc_stderr\": 0.0133688180969605\n }\n}\n```"
repo_url: https://huggingface.co/jsfs11/HighdensityRPMerge-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|arc:challenge|25_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|gsm8k|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hellaswag|10_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T13-31-31.146894.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T13-31-31.146894.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- '**/details_harness|winogrande|5_2024-03-04T13-31-31.146894.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-04T13-31-31.146894.parquet'
- config_name: results
data_files:
- split: 2024_03_04T13_31_31.146894
path:
- results_2024-03-04T13-31-31.146894.parquet
- split: latest
path:
- results_2024-03-04T13-31-31.146894.parquet
---
# Dataset Card for Evaluation run of jsfs11/HighdensityRPMerge-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/HighdensityRPMerge-7B](https://huggingface.co/jsfs11/HighdensityRPMerge-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__HighdensityRPMerge-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-04T13:31:31.146894](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__HighdensityRPMerge-7B/blob/main/results_2024-03-04T13-31-31.146894.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6497483676912047,
"acc_stderr": 0.03208497148424546,
"acc_norm": 0.6515040784336489,
"acc_norm_stderr": 0.032727124352621983,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.01732308859731476,
"mc2": 0.6044359173072787,
"mc2_stderr": 0.015333928478080957
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.013975454122756553,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693249
},
"harness|hellaswag|10": {
"acc": 0.6756622186815375,
"acc_stderr": 0.004671701705567244,
"acc_norm": 0.8657637920732921,
"acc_norm_stderr": 0.0034020920763237414
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337145,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337145
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970572,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970572
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978093,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978093
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.01642167050633917,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.01642167050633917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.01916241858862356,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.01916241858862356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.01732308859731476,
"mc2": 0.6044359173072787,
"mc2_stderr": 0.015333928478080957
},
"harness|winogrande|5": {
"acc": 0.7940015785319653,
"acc_stderr": 0.011366474352008825
},
"harness|gsm8k|5": {
"acc": 0.620166793025019,
"acc_stderr": 0.0133688180969605
}
}
```
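Once downloaded, per-task results like the JSON above can be post-processed locally. As a minimal sketch (operating on a small hand-copied excerpt of the values shown above rather than the live dataset), the following extracts the MMLU subject accuracies from the `harness|hendrycksTest-*|5` keys and reports the strongest and weakest subjects:

```python
# Excerpt of the results dict above; values copied for illustration only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8803418803418803},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8304093567251462},
    "harness|winogrande|5": {"acc": 0.7940015785319653},  # not an MMLU task
}

# Strip the "harness|hendrycksTest-" prefix and the "|5" few-shot suffix
# to map each MMLU subject name to its accuracy.
mmlu = {
    key.split("-", 1)[1].rsplit("|", 1)[0]: metrics["acc"]
    for key, metrics in results.items()
    if key.startswith("harness|hendrycksTest-")
}

best = max(mmlu, key=mmlu.get)
worst = min(mmlu, key=mmlu.get)
print(best, worst)  # → marketing abstract_algebra
```

The same key-parsing pattern applies to the full results file, since every MMLU entry follows the `harness|hendrycksTest-<subject>|<n_shots>` naming scheme.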
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/wiki_find_passage_train50_eval20_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 78327
num_examples: 120
- name: validation
num_bytes: 15395
num_examples: 20
download_size: 47982
dataset_size: 93722
---
# Dataset Card for "wiki_find_passage_train50_eval20_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_digitous__Janin-R | ---
pretty_name: Evaluation run of digitous/Janin-R
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [digitous/Janin-R](https://huggingface.co/digitous/Janin-R) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_digitous__Janin-R\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T03:14:06.115114](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__Janin-R/blob/main/results_2023-09-17T03-14-06.115114.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.04803796140939615,\n\
\ \"f1_stderr\": 0.0011624552972241407,\n \"acc\": 0.3381283685172032,\n\
\ \"acc_stderr\": 0.00874019702471766\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n\
\ \"f1\": 0.04803796140939615,\n \"f1_stderr\": 0.0011624552972241407\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \
\ \"acc_stderr\": 0.004106620637749676\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6535122336227308,\n \"acc_stderr\": 0.013373773411685646\n\
\ }\n}\n```"
repo_url: https://huggingface.co/digitous/Janin-R
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T03_14_06.115114
path:
- '**/details_harness|drop|3_2023-09-17T03-14-06.115114.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T03-14-06.115114.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T03_14_06.115114
path:
- '**/details_harness|gsm8k|5_2023-09-17T03-14-06.115114.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T03-14-06.115114.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:39.251365.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:29:39.251365.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:29:39.251365.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T03_14_06.115114
path:
- '**/details_harness|winogrande|5_2023-09-17T03-14-06.115114.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T03-14-06.115114.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_29_39.251365
path:
- results_2023-07-19T19:29:39.251365.parquet
- split: 2023_09_17T03_14_06.115114
path:
- results_2023-09-17T03-14-06.115114.parquet
- split: latest
path:
- results_2023-09-17T03-14-06.115114.parquet
---
# Dataset Card for Evaluation run of digitous/Janin-R
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/digitous/Janin-R
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [digitous/Janin-R](https://huggingface.co/digitous/Janin-R) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_digitous__Janin-R",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T03:14:06.115114](https://huggingface.co/datasets/open-llm-leaderboard/details_digitous__Janin-R/blob/main/results_2023-09-17T03-14-06.115114.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split of each eval):
```json
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.04803796140939615,
"f1_stderr": 0.0011624552972241407,
"acc": 0.3381283685172032,
"acc_stderr": 0.00874019702471766
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.04803796140939615,
"f1_stderr": 0.0011624552972241407
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.004106620637749676
},
"harness|winogrande|5": {
"acc": 0.6535122336227308,
"acc_stderr": 0.013373773411685646
}
}
```
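As a sanity check on the numbers above, the top-level `"all"` accuracy for this run is the unweighted mean of the per-task accuracies; a minimal sketch (values copied from the JSON above):

```python
# Recompute the aggregated "all" accuracy from the per-task results above.
results = {
    "harness|gsm8k|5": {"acc": 0.022744503411675512},
    "harness|winogrande|5": {"acc": 0.6535122336227308},
}

# "all.acc" is the unweighted mean over the tasks that report an accuracy.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)

print(mean_acc)  # matches the reported "all" acc of 0.3381283685172032
```

The same check applies to the `em`/`f1` fields, which here come from the single `harness|drop|3` task and are therefore copied through unchanged.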
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ynt/collabba | ---
license: unknown
---
|
dmayhem93/summarization-sft-heirarchical-train | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: 125M
dtype: string
- name: 1B
dtype: string
- name: 6B
dtype: string
- name: 20B
dtype: string
splits:
- name: train
num_bytes: 238895448
num_examples: 92544
download_size: 74612450
dataset_size: 238895448
---
# Dataset Card for "summarization-sft-heirarchical-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
juched/spotifinders-dataset | ---
license: mit
---
|
open-llm-leaderboard/details_kevin009__flyingllama-v2 | ---
pretty_name: Evaluation run of kevin009/flyingllama-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kevin009/flyingllama-v2](https://huggingface.co/kevin009/flyingllama-v2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__flyingllama-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-04T09:40:33.484186](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__flyingllama-v2/blob/main/results_2024-02-04T09-40-33.484186.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26355828648354096,\n\
\ \"acc_stderr\": 0.030989295716946252,\n \"acc_norm\": 0.26547327723680664,\n\
\ \"acc_norm_stderr\": 0.031814473208964,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.41299297017962017,\n\
\ \"mc2_stderr\": 0.014938905945440792\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2158703071672355,\n \"acc_stderr\": 0.012022975360030672,\n\
\ \"acc_norm\": 0.24744027303754265,\n \"acc_norm_stderr\": 0.01261035266329267\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.32732523401712804,\n\
\ \"acc_stderr\": 0.004682780790508346,\n \"acc_norm\": 0.3843855805616411,\n\
\ \"acc_norm_stderr\": 0.004854555294017559\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.03820169914517904,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.03820169914517904\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111836,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111836\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.19574468085106383,\n \"acc_stderr\": 0.025937853139977148,\n\
\ \"acc_norm\": 0.19574468085106383,\n \"acc_norm_stderr\": 0.025937853139977148\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400175,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2903225806451613,\n \"acc_stderr\": 0.025822106119415888,\n \"\
acc_norm\": 0.2903225806451613,\n \"acc_norm_stderr\": 0.025822106119415888\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678241,\n \"\
acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678241\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"\
acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.034107802518361825,\n\
\ \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.034107802518361825\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.02684151432295895,\n \
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.02684151432295895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.29724770642201837,\n \"acc_stderr\": 0.019595707224643544,\n \"\
acc_norm\": 0.29724770642201837,\n \"acc_norm_stderr\": 0.019595707224643544\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n\
\ \"acc_stderr\": 0.029554292605695053,\n \"acc_norm\": 0.23039215686274508,\n\
\ \"acc_norm_stderr\": 0.029554292605695053\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n\
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.16591928251121077,\n\
\ \"acc_stderr\": 0.024967553196547136,\n \"acc_norm\": 0.16591928251121077,\n\
\ \"acc_norm_stderr\": 0.024967553196547136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.32231404958677684,\n \"acc_stderr\": 0.042664163633521664,\n \"\
acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.042664163633521664\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n\
\ \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \
\ \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.027236013946196663,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.027236013946196663\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.015794302487888726,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.015794302487888726\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855716,\n\
\ \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02564686309713791,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02564686309713791\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19614147909967847,\n\
\ \"acc_stderr\": 0.022552447780478026,\n \"acc_norm\": 0.19614147909967847,\n\
\ \"acc_norm_stderr\": 0.022552447780478026\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543346,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543346\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.02577001564429039,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.02577001564429039\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27249022164276404,\n\
\ \"acc_stderr\": 0.011371658294311525,\n \"acc_norm\": 0.27249022164276404,\n\
\ \"acc_norm_stderr\": 0.011371658294311525\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23039215686274508,\n \"acc_stderr\": 0.017035229258034044,\n \
\ \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.017035229258034044\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.32653061224489793,\n \"acc_stderr\": 0.030021056238440317,\n\
\ \"acc_norm\": 0.32653061224489793,\n \"acc_norm_stderr\": 0.030021056238440317\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.41299297017962017,\n\
\ \"mc2_stderr\": 0.014938905945440792\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5027624309392266,\n \"acc_stderr\": 0.014052271211616438\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/kevin009/flyingllama-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|arc:challenge|25_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|gsm8k|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hellaswag|10_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T09-40-33.484186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T09-40-33.484186.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- '**/details_harness|winogrande|5_2024-02-04T09-40-33.484186.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-04T09-40-33.484186.parquet'
- config_name: results
data_files:
- split: 2024_02_04T09_40_33.484186
path:
- results_2024-02-04T09-40-33.484186.parquet
- split: latest
path:
- results_2024-02-04T09-40-33.484186.parquet
---
# Dataset Card for Evaluation run of kevin009/flyingllama-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kevin009/flyingllama-v2](https://huggingface.co/kevin009/flyingllama-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__flyingllama-v2",
"harness_winogrande_5",
split="train")
```
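As a small illustrative sketch (not part of the official leaderboard tooling), the timestamped split names appear to follow a fixed `strftime`-style pattern, so the most recent run can be identified by parsing and comparing them:

```python
from datetime import datetime

# Timestamped split names, e.g. "2024_02_04T09_40_33.484186", appear to
# follow this strftime pattern (an assumption based on the names above).
SPLIT_FORMAT = "%Y_%m_%dT%H_%M_%S.%f"

def latest_split(split_names):
    """Return the chronologically latest timestamped split name."""
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped, key=lambda s: datetime.strptime(s, SPLIT_FORMAT))

splits = ["latest", "2024_02_04T09_40_33.484186"]
print(latest_split(splits))  # -> 2024_02_04T09_40_33.484186
```

For a repository with a single run, this is equivalent to simply using the "latest" split; it only matters when successive evaluation runs accumulate multiple timestamped splits.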
## Latest results
These are the [latest results from run 2024-02-04T09:40:33.484186](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__flyingllama-v2/blob/main/results_2024-02-04T09-40-33.484186.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.26355828648354096,
"acc_stderr": 0.030989295716946252,
"acc_norm": 0.26547327723680664,
"acc_norm_stderr": 0.031814473208964,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.41299297017962017,
"mc2_stderr": 0.014938905945440792
},
"harness|arc:challenge|25": {
"acc": 0.2158703071672355,
"acc_stderr": 0.012022975360030672,
"acc_norm": 0.24744027303754265,
"acc_norm_stderr": 0.01261035266329267
},
"harness|hellaswag|10": {
"acc": 0.32732523401712804,
"acc_stderr": 0.004682780790508346,
"acc_norm": 0.3843855805616411,
"acc_norm_stderr": 0.004854555294017559
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517904,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517904
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111836,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111836
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.19574468085106383,
"acc_stderr": 0.025937853139977148,
"acc_norm": 0.19574468085106383,
"acc_norm_stderr": 0.025937853139977148
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400175,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2903225806451613,
"acc_stderr": 0.025822106119415888,
"acc_norm": 0.2903225806451613,
"acc_norm_stderr": 0.025822106119415888
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678241,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678241
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.034107802518361825,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.034107802518361825
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.02684151432295895,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.02684151432295895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29724770642201837,
"acc_stderr": 0.019595707224643544,
"acc_norm": 0.29724770642201837,
"acc_norm_stderr": 0.019595707224643544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.16591928251121077,
"acc_stderr": 0.024967553196547136,
"acc_norm": 0.16591928251121077,
"acc_norm_stderr": 0.024967553196547136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.042664163633521664,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.042664163633521664
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.027236013946196663,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.027236013946196663
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888726,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888726
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.022698657167855716,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.022698657167855716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02564686309713791,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02564686309713791
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19614147909967847,
"acc_stderr": 0.022552447780478026,
"acc_norm": 0.19614147909967847,
"acc_norm_stderr": 0.022552447780478026
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543346,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543346
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.02577001564429039,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.02577001564429039
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27249022164276404,
"acc_stderr": 0.011371658294311525,
"acc_norm": 0.27249022164276404,
"acc_norm_stderr": 0.011371658294311525
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.017035229258034044,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.017035229258034044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.32653061224489793,
"acc_stderr": 0.030021056238440317,
"acc_norm": 0.32653061224489793,
"acc_norm_stderr": 0.030021056238440317
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.41299297017962017,
"mc2_stderr": 0.014938905945440792
},
"harness|winogrande|5": {
"acc": 0.5027624309392266,
"acc_stderr": 0.014052271211616438
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
rdcoder/flt1strk | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 15578823.0
num_examples: 30
download_size: 15574567
dataset_size: 15578823.0
---
# Dataset Card for "flt1strk"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibm/Wish-QA-ELI5-Llama | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: id
dtype: string
- name: old_question
dtype: string
- name: old_answer
dtype: string
- name: passage_1
dtype: string
- name: passage_2
dtype: string
- name: passage_3
dtype: string
- name: text
dtype: string
- name: qa
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: doc_score
dtype: float64
- name: score_qa
dtype: float64
- name: ans_num_words
dtype: int64
- name: text_num_words
dtype: int64
- name: text_longer_1.5
dtype: int64
splits:
- name: train
num_bytes: 49631519
num_examples: 8413
download_size: 29992504
dataset_size: 49631519
---
# Dataset Card for "Wish-QA-ELI5-Llama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/noire_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of noire (Fire Emblem)
This is the dataset of noire (Fire Emblem), containing 123 images and their tags.
The core tags of this character are `breasts, short_hair, black_hair, large_breasts, hair_ornament, feather_hair_ornament`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 123 | 129.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noire_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 123 | 79.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noire_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 259 | 150.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noire_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 123 | 116.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noire_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 259 | 207.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/noire_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/noire_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
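If you only need one of the processed IMG+TXT packages (e.g. `dataset-800.zip`) rather than the waifuc-raw archive, you can pair each image with its tags using just the standard library. This is a minimal sketch under two assumptions not stated above: each image sits next to a same-named `.txt` file, and that file holds comma-separated tags.

```python
import os

def pair_images_with_tags(dataset_dir):
    """Walk an extracted IMG+TXT package and pair each image with its tag list."""
    pairs = []
    for root, _dirs, files in os.walk(dataset_dir):
        for name in sorted(files):
            stem, ext = os.path.splitext(name)
            if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
                continue
            # Assumed layout: tags live in a sibling file with the same stem.
            tag_path = os.path.join(root, stem + '.txt')
            if os.path.exists(tag_path):
                with open(tag_path, encoding='utf-8') as f:
                    tags = [t.strip() for t in f.read().split(',') if t.strip()]
                pairs.append((os.path.join(root, name), tags))
    return pairs
```

Verify both assumptions against an extracted archive before relying on this for training pipelines.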
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, cleavage, feathers, navel, solo, green_bikini, circlet, looking_at_viewer, blush, outdoors, day, blue_eyes, cloud, cowboy_shot, huge_breasts, sky |
| 1 | 13 |  |  |  |  |  | 1girl, solo, circlet, feathers, gloves, cleavage_cutout, arrow_(projectile), open_mouth, quiver, holding_bow_(weapon), medium_breasts, upper_body |
| 2 | 6 |  |  |  |  |  | 1boy, 1girl, hetero, blush, feathers, green_bikini, penis, spread_legs, uncensored, vaginal, bikini_bottom_aside, nipples, solo_focus, blue_eyes, cum_in_pussy, navel, one_eye_closed, open_mouth, sex_from_behind, sweat |
| 3 | 7 |  |  |  |  |  | 1girl, hetero, penis, 1boy, feathers, solo_focus, uncensored, fellatio, blue_eyes, cum, nipples, brown_hair, circlet, nude, open_mouth, paizuri, testicles |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | feathers | navel | solo | green_bikini | circlet | looking_at_viewer | blush | outdoors | day | blue_eyes | cloud | cowboy_shot | huge_breasts | sky | gloves | cleavage_cutout | arrow_(projectile) | open_mouth | quiver | holding_bow_(weapon) | medium_breasts | upper_body | 1boy | hetero | penis | spread_legs | uncensored | vaginal | bikini_bottom_aside | nipples | solo_focus | cum_in_pussy | one_eye_closed | sex_from_behind | sweat | fellatio | cum | brown_hair | nude | paizuri | testicles |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-----------|:--------|:-------|:---------------|:----------|:--------------------|:--------|:-----------|:------|:------------|:--------|:--------------|:---------------|:------|:---------|:------------------|:---------------------|:-------------|:---------|:-----------------------|:-----------------|:-------------|:-------|:---------|:--------|:--------------|:-------------|:----------|:----------------------|:----------|:-------------|:---------------|:-----------------|:------------------|:--------|:-----------|:------|:-------------|:-------|:----------|:------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | | X | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | X | | X | | | X | | | X | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | X | | | | X | | | | | X | | | | | | | | X | | | | | X | X | X | | X | | | X | X | | | | | X | X | X | X | X | X |
|
CyberHarem/100_shiki_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of 100_shiki/一〇〇式/樱花 (Girls' Frontline)
This is the dataset of 100_shiki/一〇〇式/樱花 (Girls' Frontline), containing 114 images and their tags.
The core tags of this character are `long_hair, black_hair, bangs, hair_ornament, red_eyes, very_long_hair`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 114 | 160.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/100_shiki_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 114 | 83.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/100_shiki_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 285 | 181.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/100_shiki_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 114 | 138.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/100_shiki_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 285 | 268.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/100_shiki_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/100_shiki_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
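The cluster tables below are mined from tag co-occurrence across items. As a rough first step you can reproduce the raw tag frequencies yourself from the loop above; a minimal sketch, assuming each item's `meta['tags']` is a mapping whose keys are tag names (the value format is an assumption, so only the keys are counted):

```python
from collections import Counter

def count_tags(tag_dicts):
    """Count how often each tag name appears across per-image tag mappings."""
    freq = Counter()
    for tags in tag_dicts:
        freq.update(tags.keys())
    return freq

# Example with the items yielded by the waifuc loop above:
# freq = count_tags(item.meta['tags'] for item in source)
# print(freq.most_common(10))
```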
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, long_sleeves, red_scarf, solo, black_shirt, looking_at_viewer, red_neckerchief, white_background, black_serafuku, black_skirt, simple_background, black_sailor_collar, blush, pleated_skirt, hand_up, black_pantyhose, blunt_bangs, closed_mouth, hair_flower, cherry_blossom_print, smile |
| 1 | 11 |  |  |  |  |  | 1girl, long_sleeves, pleated_skirt, red_scarf, black_skirt, looking_at_viewer, solo, black_pantyhose, submachine_gun, black_serafuku, blush, closed_mouth, holding_gun, jacket, white_background, cherry_blossoms, red_neckerchief, simple_background, bayonet, belt, black_shirt, flower, rifle, sailor_collar |
| 2 | 12 |  |  |  |  |  | 1girl, miko, solo, red_scarf, hakama_skirt, looking_at_viewer, red_hakama, wide_sleeves, blush, official_alternate_costume, long_sleeves, submachine_gun, white_kimono, bird, closed_mouth, holding_broom, ribbon-trimmed_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | red_scarf | solo | black_shirt | looking_at_viewer | red_neckerchief | white_background | black_serafuku | black_skirt | simple_background | black_sailor_collar | blush | pleated_skirt | hand_up | black_pantyhose | blunt_bangs | closed_mouth | hair_flower | cherry_blossom_print | smile | submachine_gun | holding_gun | jacket | cherry_blossoms | bayonet | belt | flower | rifle | sailor_collar | miko | hakama_skirt | red_hakama | wide_sleeves | official_alternate_costume | white_kimono | bird | holding_broom | ribbon-trimmed_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:------------|:-------|:--------------|:--------------------|:------------------|:-------------------|:-----------------|:--------------|:--------------------|:----------------------|:--------|:----------------|:----------|:------------------|:--------------|:---------------|:--------------|:-----------------------|:--------|:-----------------|:--------------|:---------|:------------------|:----------|:-------|:---------|:--------|:----------------|:-------|:---------------|:-------------|:---------------|:-----------------------------|:---------------|:-------|:----------------|:-------------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | X | | X | | | | | | | X | | | | | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
bdsaglam/musique-answerable-2hop-subset-jerx-reward-openai | ---
dataset_info:
features:
- name: id
dtype: string
- name: jerx.input
dtype: string
- name: jerx.output
dtype: string
- name: chat
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 214510
num_examples: 110
download_size: 0
dataset_size: 214510
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ManeAI31416/NASA_fine-tuning | ---
license: llama2
---
|
thanhduycao/viet_news_all_topics_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9527617940
num_examples: 2683782
download_size: 4723623208
dataset_size: 9527617940
---
# Dataset Card for "viet_news_all_topics_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
breadlicker45/rlhf-prompt2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 48316417
num_examples: 36943
download_size: 4007730
dataset_size: 48316417
---
# Dataset Card for "rlhf-prompt2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleemWaheed/twitter_dataset_1713000375 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 79896
num_examples: 209
download_size: 36260
dataset_size: 79896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
faranheit/ministries | ---
license: apache-2.0
task_categories:
- question-answering
language:
- ar
tags:
- not-for-all-audiences
---
{"id": "130042945016-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062c\u0645\u064a\u0639 \u0637\u0644\u0628\u0627\u062a \u0639\u0642\u0648\u062f \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u062d\u0633\u0628 \u0627\u0644\u0643\u064a\u0627\u0646 : \n \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u064f\u062a\u0645\u0651\u0643\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u064f\u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a\u060c \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0645\u062d\u062f\u0648\u062f\u0629 \u0645\u0646 \u0634\u062e\u0635 \n \u0648\u0627\u062d\u062f\u060c \u0623\u0648 \u0623\u0643\u062b\u0631 \u0645\u0646 \u0630\u0648\u064a \u0627\u0644\u0635\u0641\u0629 \u0627\u0644\u0637\u0628\u064a\u0639\u064a\u0629 \u0623\u0648 \u0627\u0644\u0625\u0639\u062a\u0628\u0627\u0631\u064a\u0629\u060c \u0648\u062a\u0639\u062f \u0630\u0645\u062a\u0647\u0627 \u0645\u0633\u062a\u0642\u0644\u0629 \u0639\u0646 \u0627\u0644\u0630\u0645\u0629 \u0627\u0644\u0645\u0627\u0644\u064a\u0629 \u0644\u0643\u0644 \u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627. 
\n \u0648\u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u062d\u062f\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0629 \u0639\u0646 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0645\u062a\u0631\u062a\u0628\u0629 \u0639\u0644\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0646\u0627\u0634\u0626\u0629 \u0639\u0646 \u0646\u0634\u0627\u0637\u0647\u0627\u060c \u0648\u0644\u0627 \u064a\u0643\u0648\u0646 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627 \u0648\u0644\u0627 \n \u0627\u0644\u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0622\u064b \u0639\u0646 \u0647\u0630\u0647 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0625\u0644\u0627 \u0628\u0642\u062f\u0631 \u062d\u0635\u062a\u0647 \u0641\u064a \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 :", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"}
{"id": "130042945016-1", "text": "\u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0636\u0627\u0645\u0646\u064a\u0629 \u062c\u0645\u064a\u0639 \u0634\u0631\u0643\u0627\u0624\u0647\u0627 \u0623\u0641\u0631\u0627\u062f \n \u0648\u0645\u0633\u0624\u0648\u0644\u064a\u0646 \u0634\u062e\u0635\u064a\u0627\u064b \u0628\u062c\u0645\u064a\u0639 \u0623\u0645\u0648\u0627\u0644\u0647\u0645 \u0648\u0628\u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a\u0647\u0627\u060c \n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0648\u0635\u064a\u0629 \u0628\u0633\u064a\u0637\u0629 \u062a\u062a\u0643\u0648\u0646 \u0645\u0646 \u0641\u0631\u064a\u0642\u064a\u0646\u060c \n (\u0627\u0644\u0645\u062a\u0636\u0627\u0645\u0646) \u0648\u0647\u0648 \u0627\u0644\u0645\u0633\u0624\u0648\u0644 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0629\u060c \u0648(\u0627\u0644\u0645\u0648\u0635\u064a) \u0648\u0647\u0648 \u0627\u0644\u0630\u064a \u0644\u0627 \u064a\u0643\u0648\u0646 \u0645\u0633\u0624\u0648\u0644\u0627\u064b 
\u0625\u0644\u0627 \u0641\u064a \u062d\u062f\u0648\u062f \u062d\u0635\u062a\u0647 \u0641\u064a \n \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022 \u064a\u062c\u0628 \u0623\u0646 \u064a\u0643\u0648\u0646 \u0627\u0644\u0634\u062e\u0635 \u0627\u0644\u0637\u0628\u064a\u0639\u064a \u0623\u0643\u0628\u0631 \u0645\u0646 18 \u0639\u0627\u0645\u060c \u0648\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0642\u0627\u0635\u0631\u064b\u0627 \u064a\u062a\u0645 \u0625\u0631\u0641\u0627\u0642 \u0635\u0643 \u0627\u0644\u0648\u0644\u0627\u064a\u0629 .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"}
{"id": "130042945016-2", "text": "\u2022 \u064a\u062c\u0628 \u0623\u0646 \u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0645\u0648\u0638\u0641\u064a\u0646 \u062d\u0643\u0648\u0645\u064a\u064a\u0646 . \n \u2022 \u0627\u0644\u062a\u062d\u0642\u0651\u064f\u0642 \u0645\u0646 \u0642\u0627\u0639\u062f\u0629 \u0627\u0644\u0639\u0645\u0644 \u0627\u0644\u062e\u0627\u0635\u0629 \u0628\u0628\u0639\u0636 \u0639\u0648\u0627\u0626\u0644 \u0646\u062c\u0631\u0627\u0646 . \n \u2022 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0634\u0631\u064a\u0643 \u0625\u0639\u062a\u0628\u0627\u0631\u064a \u064a\u062a\u0645 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0623\u0646 \u064a\u0643\u0648\u0646 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u063a\u064a\u0631 \u0645\u0634\u0637\u0648\u0628 \u0623\u0648 \u0645\u0648\u0642\u0648\u0641 \u0623\u0648 \u0645\u0646\u062a\u0647\u064a . \n \u2022 \u0641\u064a \u062d\u0627\u0644 \u0648\u062c\u0648\u062f \u0634\u0631\u064a\u0643 \u0623\u062c\u0646\u0628\u064a \u064a\u062c\u0628 \u0627\u0644\u062a\u062d\u0642\u0651\u064f\u0642 \u0645\u0646 \u0648\u062c\u0648\u062f \u0631\u062e\u0635\u0629 \u0625\u0633\u062a\u062b\u0645\u0627\u0631 \u0645\u0646 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0625\u0633\u062a\u062b\u0645\u0627\u0631 . 
\n \u2022 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u062c\u0647\u0629 \u062d\u0643\u0648\u0645\u064a\u0629/\u0645\u0624\u0633\u0633\u0629 \u0623\u0647\u0644\u064a\u0629/\u062c\u0645\u0639\u064a\u0629 \u062e\u064a\u0631\u064a\u0629/ \u0648\u0642\u0641 \"\u064a\u062c\u0628 \u0648\u062c\u0648\u062f \u0633\u0646\u062f \u0646\u0638\u0627\u0645\u064a \u064a\u062e\u0648\u0644\u0647\u0627 \u0628\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0623\u0648 \u0627\u0644\u0645\u0634\u0627\u0631\u0643\u0629 \u0641\u064a \u0634\u0631\u0643\u0629 \". \n \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0627\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a. \n \u2022\u0627\u0644\u0645\u0633\u062a\u0646\u062f\u0627\u062a \u0627\u0644\u0645\u0637\u0644\u0648\u0628\u0629 : \n \u2022 \u0625\u0631\u0641\u0627\u0642 \u062a\u0631\u062e\u064a\u0635 \u0645\u0646 \u0627\u0644\u0628\u0646\u0643 \u0627\u0644\u0645\u0631\u0643\u0632\u064a \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0627\u0644\u0646\u0634\u0627\u0637 \u064a\u062a\u0637\u0644\u0628 \u0630\u0644\u0643.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"}
{"id": "130042945016-3", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0625\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0627\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 (\u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 - \u0627\u0644\u062a\u0636\u0627\u0645\u0646 - \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) . \n 4\u2022\u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . 
\n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 - \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629: 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 \u0644\u0644\u0643\u064a\u0627\u0646\u0627\u062a: 500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"}
{"id": "130042945016-4", "text": "\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=5 \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=3 \n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=4/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05671c"}
{"id": "952da374a2f2-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u064f\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062c\u0645\u064a\u0639 \u0637\u0644\u0628\u0627\u062a \u0639\u0642\u0648\u062f \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u0627\u0644\u0645\u0647\u0646\u064a\u0629 \u0628\u0645\u0648\u062c\u0628 \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0627\u0644\u0645\u0647\u0646\u064a \u0627\u0644\u0635\u0627\u062f\u0631 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u064a\u062a\u0645 \u0627\u0644\u062a\u062d\u0642\u0651\u064f\u0642 \u0645\u0646 \u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629 \u0644\u0644\u0634\u0631\u0643\u0627\u062a \u0627\u0644\u062a\u062c\u0627\u0631\u064a\u0629\u060c \u0628\u0627\u0644\u0625\u0636\u0627\u0641\u0629 \u0625\u0644\u0649 \u0627\u0644\u062a\u0627\u0644\u064a: \n \u2022\u064a\u062c\u0628 \u0623\u0646 \u062a\u062a\u0648\u0641\u0631 \u0631\u062e\u0635\u0629 \u0645\u0647\u0646\u064a\u0629 \u0633\u0627\u0631\u064a\u0629 \u0644\u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0648\u0642\u062a \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628\u060c \u0648\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u063a\u064a\u0631 \u0633\u0639\u0648\u062f\u064a \u064a\u062c\u0628 \u0625\u0631\u0641\u0627\u0642 \u062a\u0631\u062e\u064a\u0635 \u0645\u0647\u0646\u064a \u0635\u0627\u062f\u0631 \u0645\u0646 \u062f\u0627\u062e\u0644 \u0627\u0644\u0645\u0645\u0644\u0643\u0629. 
\n \u2022 \u0623\u0644\u0627 \u062a\u0642\u0644 \u0646\u0633\u0628\u0629 \u0627\u0644\u0634\u0631\u064a\u0643 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0627\u0644\u0645\u0631\u062e\u0651\u064e\u0635 \u0639\u0646 (25%) \u0645\u0646 \u0631\u0623\u0633 \u0645\u0627\u0644 \u0627\u0644\u0634\u0631\u0643\u0629 \u0627\u0644\u0645\u0647\u0646\u064a\u0629 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u062e\u062a\u0644\u0637\u0629. \n \u2022 \u0623\u0644\u0627 \u062a\u0642\u0644 \u0646\u0633\u0628\u0629 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0645\u0631\u062e\u0651\u064e\u0635\u064a\u0646 \u0639\u0646 (70%) \u0644\u0633\u0639\u0648\u062f\u064a \u0648\u0627\u0644\u062e\u0644\u064a\u062c\u064a . \n \u2022 \u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0627\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056720"}
{"id": "952da374a2f2-1", "text": "\u2022\u0623\u0646\u0648\u0627\u0639 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0641\u064a \u0639\u0642\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 : \n \u2022\u0634\u0631\u064a\u0643 \u0645\u0631\u062e\u0635 . \n \u2022\u0634\u0631\u064a\u0643 \u0628\u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . \n \u2022\u0634\u0631\u064a\u0643 \u0628\u0627\u0644\u0639\u0645\u0644 ,\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0625\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 (\u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629-\u0627\u0644\u062a\u0636\u0627\u0645\u0646-\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) \u0645\u0647\u0646\u064a\u0629 . \n 4\u2022\u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . 
\n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . \n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629: 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056720"}
{"id": "952da374a2f2-2", "text": "\u0627\u0644\u062a\u0636\u0627\u0645\u0646-\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629: 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u064a\u0636\u0627\u0641 \u0639\u0644\u064a\u0647\u0627 500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 + 15% \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n \u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=3 \n \u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=5 \n \u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=4", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056720"}
{"id": "fd131aa6e7d9-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062c\u0645\u064a\u0639 \u0637\u0644\u0628\u0627\u062a \u0639\u0642\u0648\u062f \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u062d\u0633\u0628 \u0627\u0644\u0643\u064a\u0627\u0646 : \n \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u064f\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a\u060c \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0645\u062d\u062f\u0648\u062f\u0629 \u0645\u0646 \u0634\u062e\u0635 \n \u0648\u0627\u062d\u062f\u060c \u0623\u0648 \u0623\u0643\u062b\u0631 \u0645\u0646 \u0630\u0648\u064a \u0627\u0644\u0635\u0641\u0629 \u0627\u0644\u0637\u0628\u064a\u0639\u064a\u0629 \u0623\u0648 \u0627\u0644\u0625\u0639\u062a\u0628\u0627\u0631\u064a\u0629\u060c \u0648\u062a\u0639\u062f \u0630\u0645\u062a\u0647\u0627 \u0645\u0633\u062a\u0642\u0644\u0629 \u0639\u0646 \u0627\u0644\u0630\u0645\u0629 \u0627\u0644\u0645\u0627\u0644\u064a\u0629 \u0644\u0643\u0644 \u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627. 
\n \u0648\u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u062d\u062f\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0629 \u0639\u0646 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0645\u062a\u0631\u062a\u0628\u0629 \u0639\u0644\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0646\u0627\u0634\u0626\u0629 \u0639\u0646 \u0646\u0634\u0627\u0637\u0647\u0627\u060c \u0648\u0644\u0627 \u064a\u0643\u0648\u0646 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627 \u0648\u0644\u0627 \n \u0627\u0644\u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0627\u064b \u0639\u0646 \u0647\u0630\u0647 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0625\u0644\u0627 \u0628\u0642\u062f\u0631 \u062d\u0635\u062a\u0647 \u0641\u064a \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 :", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056721"}
{"id": "fd131aa6e7d9-1", "text": "\u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0636\u0627\u0645\u0646\u064a\u0629 \u062c\u0645\u064a\u0639 \u0634\u0631\u0643\u0627\u0624\u0647\u0627 \u0623\u0641\u0631\u0627\u062f \n \u0648\u0645\u0633\u0624\u0648\u0644\u064a\u0646 \u0634\u062e\u0635\u064a\u0627\u064b \u0628\u062c\u0645\u064a\u0639 \u0623\u0645\u0648\u0627\u0644\u0647\u0645 \u0648\u0628\u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a\u0647\u0627\u060c \n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0648\u0635\u064a\u0629 \u0628\u0633\u064a\u0637\u0629 \u062a\u062a\u0643\u0648\u0646 \u0645\u0646 \u0641\u0631\u064a\u0642\u064a\u0646\u060c \n (\u0627\u0644\u0645\u062a\u0636\u0627\u0645\u0646) \u0648\u0647\u0648 \u0627\u0644\u0645\u0633\u0624\u0648\u0644 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0629\u060c \u0648(\u0627\u0644\u0645\u0648\u0635\u064a) \u0648\u0647\u0648 \u0627\u0644\u0630\u064a \u0644\u0627 \u064a\u0643\u0648\u0646 \u0645\u0633\u0624\u0648\u0644\u0627\u064b 
\u0625\u0644\u0627 \u0641\u064a \u062d\u062f\u0648\u062f \u062d\u0635\u062a\u0647 \u0641\u064a \n \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0631\u062e\u064a\u0635 \u0645\u0646 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0627\u0633\u062a\u062b\u0645\u0627\u0631. \n \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056721"}
{"id": "fd131aa6e7d9-2", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0627\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 ( \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 - \u0627\u0644\u062a\u0636\u0627\u0645\u0646 - \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) \u060c \u0623\u062c\u0646\u0628\u064a\u0629 \n 4\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0627\u0637\u0631\u0627\u0641 . \n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . 
\n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a\u064b\u0627 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 : 500 \u0648 \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 : 15%. \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u0648 \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 : 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056721"}
{"id": "fd131aa6e7d9-3", "text": "\u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 : 500 \u0648\u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 : 15%.\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=143 \n * \u0639\u0646\u062f \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u0638\u0647\u0631 \u0644\u0644\u0639\u0645\u064a\u0644 \u0646\u0648\u0639 \u0627\u0644\u0643\u064a\u0627\u0646 \u0627\u0644\u0645\u0637\u0644\u0648\u0628 \u0639\u0646\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 : (\u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629-\u0627\u0644\u062a\u0636\u0627\u0645\u0646-\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) \u060c \u0623\u062c\u0646\u0628\u064a\u0629", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056721"}
{"id": "3742980811a3-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062c\u0645\u064a\u0639 \u0637\u0644\u0628\u0627\u062a \u0639\u0642\u0648\u062f \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u062d\u0633\u0628 \u0627\u0644\u0643\u064a\u0627\u0646 : \n \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u064f\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a\u060c \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0645\u062d\u062f\u0648\u062f\u0629 \u0645\u0646 \u0634\u062e\u0635 \n \u0648\u0627\u062d\u062f\u060c \u0623\u0648 \u0623\u0643\u062b\u0631 \u0645\u0646 \u0630\u0648\u064a \u0627\u0644\u0635\u0641\u0629 \u0627\u0644\u0637\u0628\u064a\u0639\u064a\u0629 \u0623\u0648 \u0627\u0644\u0625\u0639\u062a\u0628\u0627\u0631\u064a\u0629\u060c \u0648\u062a\u0639\u062f \u0630\u0645\u062a\u0647\u0627 \u0645\u0633\u062a\u0642\u0644\u0629 \u0639\u0646 \u0627\u0644\u0630\u0645\u0629 \u0627\u0644\u0645\u0627\u0644\u064a\u0629 \u0644\u0643\u0644 \u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627. 
\n \u0648\u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u062d\u062f\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0629 \u0639\u0646 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0645\u062a\u0631\u062a\u0628\u0629 \u0639\u0644\u064a\u0647\u0627 \u0623\u0648 \u0627\u0644\u0646\u0627\u0634\u0626\u0629 \u0639\u0646 \u0646\u0634\u0627\u0637\u0647\u0627\u060c \u0648\u0644\u0627 \u064a\u0643\u0648\u0646 \u0627\u0644\u0645\u0627\u0644\u0643 \u0644\u0647\u0627 \u0648\u0644\u0627 \n \u0627\u0644\u0634\u0631\u064a\u0643 \u0641\u064a\u0647\u0627 \u0645\u0633\u0624\u0648\u0644\u0627\u064b \u0639\u0646 \u0647\u0630\u0647 \u0627\u0644\u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u0625\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0625\u0644\u0627 \u0628\u0642\u062f\u0631 \u062d\u0635\u062a\u0647 \u0641\u064a \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . \n \u0627\u0644\u062a\u0636\u0627\u0645\u0646 :", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056722"}
{"id": "3742980811a3-1", "text": "\u0627\u0644\u062a\u0636\u0627\u0645\u0646 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0636\u0627\u0645\u0646\u064a\u0629 \u062c\u0645\u064a\u0639 \u0634\u0631\u0643\u0627\u0624\u0647\u0627 \u0623\u0641\u0631\u0627\u062f \n \u0648\u0645\u0633\u0624\u0648\u0644\u064a\u0646 \u0634\u062e\u0635\u064a\u0627\u064b \u0628\u062c\u0645\u064a\u0639 \u0623\u0645\u0648\u0627\u0644\u0647\u0645 \u0648\u0628\u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a\u0647\u0627\u060c \n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 : \n \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u064f\u0645\u0643\u0651\u0646 \u0627\u0644\u0645\u0633\u062a\u062b\u0645\u0631 \u0645\u0646 \u0627\u0644\u0628\u062f\u0621 \u0641\u064a \u0645\u0645\u0627\u0631\u0633\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u062a\u0623\u0633\u064a\u0633 \u0634\u0631\u0643\u0629 \u062a\u0648\u0635\u064a\u0629 \u0628\u0633\u064a\u0637\u0629 \u062a\u062a\u0643\u0648\u0646 \u0645\u0646 \u0641\u0631\u064a\u0642\u064a\u0646\u060c \n (\u0627\u0644\u0645\u062a\u0636\u0627\u0645\u0646) \u0648\u0647\u0648 \u0627\u0644\u0645\u0633\u0624\u0648\u0644 \u0639\u0646 \u062f\u064a\u0648\u0646 \u0648\u0627\u0644\u062a\u0632\u0627\u0645\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0629\u060c \u0648(\u0627\u0644\u0645\u0648\u0635\u064a) \u0648\u0647\u0648 \u0627\u0644\u0630\u064a \u0644\u0627 \u064a\u0643\u0648\u0646 \u0645\u0633\u0624\u0648\u0644\u0627\u064b 
\u0625\u0644\u0627 \u0641\u064a \u062d\u062f\u0648\u062f \u062d\u0635\u062a\u0647 \u0641\u064a \n \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0631\u062e\u064a\u0635 \u0645\u0646 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0627\u0633\u062a\u062b\u0645\u0627\u0631. \n \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056722"}
{"id": "3742980811a3-2", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 ( \u0634\u0631\u0643\u0629 \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629 - \u0627\u0644\u062a\u0636\u0627\u0645\u0646 - \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 ) . \u0645\u062e\u062a\u0644\u0637\u0629 . \n 4\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0627\u0637\u0631\u0627\u0641 . \n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . 
\n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a\u064b\u0627 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629: 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 \u0648\u0627\u0644\u062a\u0636\u0627\u0645\u0646: 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056722"}
{"id": "3742980811a3-3", "text": "\u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 \u0644\u0644\u0643\u064a\u0627\u0646\u0627\u062a: 500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=143 \n \u0639\u0646\u062f \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u0638\u0647\u0631 \u0644\u0644\u0639\u0645\u064a\u0644 \u0646\u0648\u0639 \u0627\u0644\u0643\u064a\u0627\u0646 \u0627\u0644\u0645\u0637\u0644\u0648\u0628 \u0639\u0646\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 : (\u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629-\u0627\u0644\u062a\u0636\u0627\u0645\u0646\u064a\u0629-\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629) \u060c \u0645\u062e\u062a\u0644\u0637\u0629", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056722"}
{"id": "d1ca31729b6e-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u0648\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 \u0644\u0644\u0634\u0631\u0643\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0647\u064a\u0626\u0629 \u0627\u0644\u0632\u0643\u0627\u0629 \u0648 \u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0648 \u0627\u0644\u062c\u0645\u0627\u0631\u0643 . \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0639\u062f\u0644. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629. 
\n \u2022 \u0645\u0644\u0627\u062d\u0638\u0629: \n \u2022 \u0641\u064a \u062d\u0627\u0644\u0629 \u0625\u0636\u0627\u0641\u0629 / \u062d\u0630\u0641 \u0628\u064a\u0627\u0646\u0627\u062a \u0634\u0631\u064a\u0643 \u0628\u0646\u0627\u0621\u064b \u0639\u0644\u0649 \u062d\u0643\u0645 \u0642\u0636\u0627\u0626\u064a \u0623\u0648 \u0635\u0643 \u0648\u0631\u062b\u0629 \u0623\u0648 \u0648\u062c\u0648\u062f \u0634\u0631\u064a\u0643 \u0642\u0627\u0635\u0631 \u062a\u0643\u0648\u0646 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0645\u0646 \u062e\u0644\u0627\u0644 \u0645\u064f\u0639\u062a\u0645\u062f \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0623\u0648 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056723"}
{"id": "d1ca31729b6e-1", "text": "3\u2022 \u062a\u062d\u062f\u064a\u062f \u062e\u062f\u0645\u0629 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u062f \u0634\u0631\u0643\u0629 . \n 4\u2022 \u062a\u062d\u062f\u064a\u062f \u0633\u0628\u0628 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 . \n 5\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 6\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 8\u2022 \u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 9\u2022 \u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . \n 10\u2022 \u062a\u062d\u062f\u064a\u062b \u0627\u0644\u0648\u062b\u0627\u0626\u0642 (\u0646\u0638\u0627\u0645 \u0627\u0644\u0623\u0633\u0627\u0633/\u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) . \n 11\u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n 12\u2022\u0628\u0625\u0645\u0643\u0627\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u062d\u0627\u0644\u0629 \u0627\u0644\u0637\u0644\u0628 ( \u0628\u0625\u0646\u062a\u0638\u0627\u0631 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a ) \u0628\u0625\u062a\u0628\u0627\u0639 \u0627\u0644\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062a\u0627\u0644\u064a\u0629 : \n 13\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). 
\n 14\u2022\u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0628\u0631 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 15\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0637\u0644\u0628\u0627\u062a\u064a . \n 16\u2022\u0625\u0636\u0627\u0641\u0629 \u0631\u0642\u0645 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u0625\u064a\u0642\u0648\u0646\u0629 (\u0628\u062d\u062b \u0645\u062a\u0642\u062f\u0645 ) \u0641\u064a \u0623\u0639\u0644\u0649 \u0627\u0644\u0635\u0641\u062d\u0629 .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056723"}
{"id": "d1ca31729b6e-2", "text": "17\u2022\u0628\u0639\u062f \u0638\u0647\u0648\u0631 \u0627\u0644\u0637\u0644\u0628 \u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0625\u062c\u0631\u0627\u0621\u0627\u062a \u0648\u0627\u062e\u062a\u064a\u0627\u0631 ( \u062a\u0641\u0627\u0635\u064a\u0644 ) . \n 18\u2022\u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0639\u0642\u062f : 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a . \n \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% . \n \u0631\u0633\u0648\u0645 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a: 100 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056723"}
{"id": "a087ade56ad1-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u0648\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 \u0644\u0644\u0634\u0631\u0643\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0641\u064a \u062d\u0627\u0644 \u062a\u0639\u062f\u064a\u0644 \u0645\u062c\u0644\u0633 \u0627\u0644\u0645\u062f\u064a\u0631\u064a\u0646 \u0623\u0648 \u0627\u0644\u0625\u062f\u0627\u0631\u0629 \u064a\u062c\u0628 \u0645\u0631\u0627\u0639\u0627\u0629 \u0627\u0644\u0646\u0635\u0627\u0628 \u0627\u0644\u0642\u0627\u0646\u0648\u0646\u064a . \n \u2022\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0628\u062f\u062e\u0648\u0644 \u0634\u0631\u064a\u0643 \u0645\u0647\u0646\u064a \u064a\u062c\u0628 \u0623\u0646 \u064a\u0643\u0648\u0646 \u0645\u0631\u062e\u0651\u064e\u0635. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0647\u064a\u0626\u0629 \u0627\u0644\u0632\u0643\u0627\u0629 \u0648\u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0648\u0627\u0644\u062c\u0645\u0627\u0631\u0643. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0639\u062f\u0644 . \n \u2022\u0623\u0646 \u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629 . 
\n \u2022\u0645\u0644\u0627\u062d\u0638\u0629: \n \u2022\u0641\u064a \u062d\u0627\u0644\u0629 \u0625\u0636\u0627\u0641\u0629 / \u062d\u0630\u0641 \u0628\u064a\u0627\u0646\u0627\u062a \u0634\u0631\u064a\u0643 \u0628\u0646\u0627\u0621\u064b \u0639\u0644\u0649 \u062d\u0643\u0645 \u0642\u0636\u0627\u0626\u064a \u0623\u0648 \u0635\u0643 \u0648\u0631\u062b\u0629 \u0623\u0648 \u0648\u062c\u0648\u062f \u0634\u0631\u064a\u0643 \u0642\u0627\u0635\u0631 \u062a\u0643\u0648\u0646 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0645\u0646 \u062e\u0644\u0627\u0644 \u0645\u064f\u0639\u062a\u0645\u062f \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0623\u0648 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672b"}
{"id": "a087ade56ad1-1", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u062e\u062f\u0645\u0629 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u062f \u0634\u0631\u0643\u0629 . \n 4\u2022 \u062a\u062d\u062f\u064a\u062f \u0633\u0628\u0628 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 . \n 5\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 6\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 8\u2022 \u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 9\u2022 \u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . \n 10\u2022 \u062a\u062d\u062f\u064a\u062b \u0627\u0644\u0648\u062b\u0627\u0626\u0642 (\u0646\u0638\u0627\u0645 \u0627\u0644\u0623\u0633\u0627\u0633/\u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) . 
\n 11\u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n 12\u2022\u0628\u0625\u0645\u0643\u0627\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u062d\u0627\u0644\u0629 \u0627\u0644\u0637\u0644\u0628 ( \u0628\u0625\u0646\u062a\u0638\u0627\u0631 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a ) \u0628\u0625\u062a\u0628\u0627\u0639 \u0627\u0644\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062a\u0627\u0644\u064a\u0629 : \n 13\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC).", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672b"}
{"id": "a087ade56ad1-2", "text": "14\u2022\u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0628\u0631 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 15\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0637\u0644\u0628\u0627\u062a\u064a . \n 16\u2022\u0625\u0636\u0627\u0641\u0629 \u0631\u0642\u0645 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u0625\u064a\u0642\u0648\u0646\u0629 (\u0628\u062d\u062b \u0645\u062a\u0642\u062f\u0645 ) \u0641\u064a \u0623\u0639\u0644\u0649 \u0627\u0644\u0635\u0641\u062d\u0629 . \n 17\u2022\u0628\u0639\u062f \u0638\u0647\u0648\u0631 \u0627\u0644\u0637\u0644\u0628 \u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0625\u062c\u0631\u0627\u0621\u0627\u062a \u0648\u0623\u062e\u062a\u064a\u0627\u0631 ( \u062a\u0641\u0627\u0635\u064a\u0644 ) . \n 18\u2022\u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0639\u0642\u062f : 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a\u060c \u064a\u0636\u0627\u0641 \u0639\u0644\u064a\u0647\u0627 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% . 
\n \u0631\u0633\u0648\u0645 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a : 100 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a..\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672b"}
{"id": "9a1a18440fc9-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u0648\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 \u0644\u0644\u0634\u0631\u0643\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0647\u064a\u0626\u0629 \u0627\u0644\u0632\u0643\u0627\u0629 \u0648 \u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0648\u0627\u0644\u062c\u0645\u0627\u0631\u0643. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0639\u062f\u0644. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629. \n \u2022\u0641\u064a \u062d\u0627\u0644\u0629 \u0625\u0636\u0627\u0641\u0629 / \u062d\u0630\u0641 \u0628\u064a\u0627\u0646\u0627\u062a \u0634\u0631\u064a\u0643 \u0628\u0646\u0627\u0621\u064b \u0639\u0644\u0649 \u062d\u0643\u0645 \u0642\u0636\u0627\u0626\u064a \u064a\u062a\u0645 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0639\u0628\u0631 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644\u060c \u0648\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0635\u0643 \u0648\u0631\u062b\u0629 \u062a\u0643\u0648\u0646 \u0639\u0628\u0631 \u0645\u064f\u0639\u062a\u0645\u062f \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0623\u0648 \u0643\u0627\u062a\u0628 \u0639\u062f\u0644. 
\n \u2022\u0641\u064a \u062d\u0627\u0644 \u0625\u0636\u0627\u0641\u0629 \u0634\u0631\u064a\u0643 \u0642\u0627\u0635\u0631 \u064a\u062a\u0645 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0639\u0628\u0631 \u0643\u0627\u062a\u0628 \u0639\u062f\u0644. \n \u2022\u0625\u0631\u0641\u0627\u0642 \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0627\u0644\u0625\u0633\u062a\u062b\u0645\u0627\u0631\u064a \u0645\u0639 \u0645\u0631\u0627\u0639\u0627\u0629 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0641\u064a \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0623\u0648\u0644\u0627\u064b.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672c"}
{"id": "9a1a18440fc9-1", "text": "\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0627\u0644\u0643\u062a\u0631\u0648\u0646\u064a \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u062e\u062f\u0645\u0629 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u062f \u0634\u0631\u0643\u0629 . \n 4\u2022 \u062a\u062d\u062f\u064a\u062f \u0633\u0628\u0628 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 . \n 5\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 6\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 8\u2022 \u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 9\u2022 \u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0629 . \n 10\u2022 \u062a\u062d\u062f\u064a\u062b \u0627\u0644\u0648\u062b\u0627\u0626\u0642 (\u0646\u0638\u0627\u0645 \u0627\u0644\u0623\u0633\u0627\u0633/\u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) . 
\n 11\u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n 12\u2022\u0628\u0625\u0645\u0643\u0627\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u062d\u0627\u0644\u0629 \u0627\u0644\u0637\u0644\u0628 ( \u0628\u0625\u0646\u062a\u0638\u0627\u0631 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a ) \u0628\u0625\u062a\u0628\u0627\u0639 \u0627\u0644\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062a\u0627\u0644\u064a\u0647 : \n 13\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC).", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672c"}
{"id": "9a1a18440fc9-2", "text": "14\u2022\u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0628\u0631 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 15\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0637\u0644\u0628\u0627\u062a\u064a . \n 16\u2022\u0625\u0636\u0627\u0641\u0629 \u0631\u0642\u0645 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u0625\u064a\u0642\u0648\u0646\u0629 (\u0628\u062d\u062b \u0645\u062a\u0642\u062f\u0645 ) \u0641\u064a \u0623\u0639\u0644\u0649 \u0627\u0644\u0635\u0641\u062d\u0629 . \n 17\u2022\u0628\u0639\u062f \u0638\u0647\u0648\u0631 \u0627\u0644\u0637\u0644\u0628 \u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0625\u062c\u0631\u0627\u0621\u0627\u062a \u0648\u0627\u062e\u062a\u064a\u0627\u0631 ( \u062a\u0641\u0627\u0635\u064a\u0644 ) . \n 18\u2022\u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0639\u0642\u062f : \n 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% . 
\n \u0631\u0633\u0648\u0645 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a : \n 100 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672c"}
{"id": "503dc6645068-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0623\u0633\u064a\u0633 \u0627\u0644\u0634\u0631\u0643\u0627\u062a \u0627\u0644\u0645\u0647\u0646\u064a\u0629 \u0627\u0644\u0635\u0627\u062f\u0631\u0629 \u0628\u0645\u0648\u062c\u0628 \u062a\u0631\u062e\u064a\u0635 \u0645\u0647\u0646\u064a \u0648 \u0645\u0632\u0627\u0648\u0644\u0629 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0628\u0623\u0646\u0648\u0627\u0639\u0647\u0627: (\u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629- \u0627\u0644\u062a\u0636\u0627\u0645\u0646 \u2013 \u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629- \u0627\u0644\u0645\u0633\u0627\u0647\u0645\u0629)\u00a0 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u064a\u062c\u0628 \u0623\u0646 \u062a\u062a\u0648\u0641\u0631 \u0631\u062e\u0635\u0629 \u0645\u0647\u0646\u064a\u0629 \u0633\u0627\u0631\u064a\u0629 \u0644\u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0648\u0642\u062a \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n \u2022\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0623\u062d\u062f \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u063a\u064a\u0631 \u0633\u0639\u0648\u062f\u064a\u061b \u064a\u062c\u0628 \u0625\u0631\u0641\u0627\u0642 \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0627\u0644\u0645\u0647\u0646\u064a \u0627\u0644\u0635\u0627\u062f\u0631 \u0645\u0646 \u062f\u0627\u062e\u0644 \u0627\u0644\u0645\u0645\u0644\u0643\u0629. 
\n \u2022\u0623\u0646 \u0644\u0627 \u062a\u0642\u0644 \u0646\u0633\u0628\u0629 \u0627\u0644\u0634\u0631\u064a\u0643 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0627\u0644\u0645\u0631\u062e\u0651\u064e\u0635 \u0639\u0646 (25%) \u0645\u0646 \u0631\u0623\u0633 \u0645\u0627\u0644 \u0627\u0644\u0634\u0631\u0643\u0629 \u0627\u0644\u0645\u0647\u0646\u064a\u0629 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u062e\u062a\u0644\u0637\u0629 . \n \u2022\u0623\u0646 \u0644\u0627 \u062a\u0642\u0644 \u0646\u0633\u0628\u0629 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0645\u0631\u062e\u0651\u064e\u0635\u064a\u0646 \u0639\u0646 (70%) \u0644\u0633\u0639\u0648\u062f\u064a \u0648\u0627\u0644\u062e\u0644\u064a\u062c\u064a . \n \u2022\u0648\u062c\u0648\u062f \u062a\u0631\u062e\u064a\u0635 \u0627\u0633\u062a\u062b\u0645\u0627\u0631\u064a .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672d"}
{"id": "503dc6645068-1", "text": "\u2022\u0648\u062c\u0648\u062f \u062a\u0631\u062e\u064a\u0635 \u0627\u0633\u062a\u062b\u0645\u0627\u0631\u064a . \n \u2022\u0625\u0631\u0641\u0627\u0642 \u062a\u0642\u0631\u064a\u0631 \u0627\u0644\u0645\u0642\u064a\u0645 \u0627\u0644\u0645\u0639\u062a\u0645\u062f \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0631\u0627\u0633 \u0627\u0644\u0645\u0627\u0644 \u0639\u064a\u0646\u064a. \n \u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n \u2022\u0623\u0646\u0648\u0627\u0639 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0641\u064a \u0627\u0644\u0639\u0642\u062f : \n \u2022\u0634\u0631\u064a\u0643 \u0645\u0631\u062e\u0635. \n \u2022\u0634\u0631\u064a\u0643 \u0628\u0631\u0623\u0633 \u0627\u0644\u0645\u0627\u0644 . \n \u2022\u0634\u0631\u064a\u0643 \u0628\u0627\u0644\u0639\u0645\u0644.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0625\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC) . \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u0646\u0648\u0639 \u0648\u0635\u0641\u0629 \u0627\u0644\u0634\u0631\u0643\u0629 ( \u0645\u0647\u0646\u064a\u0629 ) . \n 4\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 5\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 6\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . 
\n 7\u2022\u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0627\u0637\u0631\u0627\u0641 . \n 8\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0631\u0629 . \n 9\u2022\u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0648\u062b\u0627\u0626\u0642 \u0648\u0637\u0628\u0627\u0639\u0629 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0648\u0639\u0642\u062f \u0627\u0644\u0634\u0631\u0643\u0629 \u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a\u064b\u0627 .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672d"}
{"id": "503dc6645068-2", "text": "\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0627\u0644\u0645\u0647\u0646\u064a\u0629 ( \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629) 1200 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0627\u0644\u0645\u0647\u0646\u064a\u0629 (\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629 \u0648\u0627\u0644\u062a\u0636\u0627\u0645\u0646) 800 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0627\u0644\u0645\u0647\u0646\u064a\u0629 (\u0627\u0644\u0645\u0633\u0627\u0647\u0645\u0629) 1600 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a. \n \u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 : 500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u064a\u0636\u0627\u0641 \u0639\u0644\u064a\u0647\u0627 \u0642\u064a\u0645\u0629 \u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a : \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=143/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672d"}
{"id": "03cfa28e4c27-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u0642\u062f\u064a\u0645 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u0648\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 \u0644\u0644\u0634\u0631\u0643\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u0641\u064a \u062d\u0627\u0644 \u062a\u0639\u062f\u064a\u0644 \u0645\u062c\u0644\u0633 \u0627\u0644\u0645\u062f\u064a\u0631\u064a\u0646 \u0623\u0648 \u0627\u0644\u0625\u062f\u0627\u0631\u0629 \u064a\u062c\u0628 \u0645\u0631\u0627\u0639\u0627\u0629 \u0627\u0644\u0646\u0635\u0627\u0628 \u0627\u0644\u0642\u0627\u0646\u0648\u0646\u064a . \n \u2022\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0628\u062f\u062e\u0648\u0644 \u0634\u0631\u064a\u0643 \u0645\u0647\u0646\u064a \u064a\u062c\u0628 \u0623\u0646 \u064a\u0643\u0648\u0646 \u0645\u0631\u062e\u0651\u064e\u0635 . \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0647\u064a\u0626\u0629 \u0627\u0644\u0632\u0643\u0627\u0629 \u0648 \u0627\u0644\u0636\u0631\u064a\u0628\u0629 \u0648 \u0627\u0644\u062c\u0645\u0627\u0631\u0643 . \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u0639\u062f\u0644. \n \u2022\u0623\u0644\u0627 \u062a\u0643\u0648\u0646 \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0648\u0642\u0648\u0641\u0629 \u0645\u0646 \u0642\u0628\u0644 \u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629. 
\n \u2022\u0641\u064a \u062d\u0627\u0644\u0629 \u0625\u0636\u0627\u0641\u0629 / \u062d\u0630\u0641 \u0628\u064a\u0627\u0646\u0627\u062a \u0634\u0631\u064a\u0643 \u0628\u0646\u0627\u0621\u064b \u0639\u0644\u0649 \u062d\u0643\u0645 \u0642\u0636\u0627\u0626\u064a \u064a\u062a\u0645 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0639\u0628\u0631 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644\u060c \u0648\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646 \u0635\u0643 \u0648\u0631\u062b\u0629 \u062a\u0643\u0648\u0646 \u0639\u0628\u0631 \u0645\u064f\u0639\u062a\u0645\u062f \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0623\u0648 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644. \n \u2022\u0641\u064a \u062d\u0627\u0644 \u0625\u0636\u0627\u0641\u0629 \u0634\u0631\u064a\u0643 \u0642\u0627\u0635\u0631 \u064a\u062a\u0645 \u0627\u0644\u0645\u0648\u0627\u0641\u0642\u0629 \u0639\u0628\u0631 \u0643\u0627\u062a\u0628 \u0627\u0644\u0639\u062f\u0644.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672e"}
{"id": "03cfa28e4c27-1", "text": "\u2022\u0625\u0631\u0641\u0627\u0642 \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0627\u0644\u0625\u0633\u062a\u062b\u0645\u0627\u0631\u064a \u0645\u0639 \u0645\u0631\u0627\u0639\u0627\u0629 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0646\u0634\u0627\u0637 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0641\u064a \u0627\u0644\u062a\u0631\u062e\u064a\u0635 \u0623\u0648\u0644\u0627\u064b.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 2\u2022 \u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0628\u0648\u0627\u0633\u0637\u0629 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 3\u2022 \u062a\u062d\u062f\u064a\u062f \u062e\u062f\u0645\u0629 \u062a\u0639\u062f\u064a\u0644 \u0639\u0642\u062f \u0634\u0631\u0643\u0629 . \n 4\u2022 \u062a\u062d\u062f\u064a\u062f \u0633\u0628\u0628 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 . \n 5\u2022 \u0625\u0633\u062a\u0643\u0645\u0627\u0644 \u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0646\u0645\u0648\u0630\u062c \u0627\u0644\u062a\u0642\u062f\u064a\u0645 . \n 6\u2022 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u0637\u0644\u0628 . \n 7\u2022 \u0645\u0639\u0627\u0644\u062c\u0629 \u0627\u0644\u0637\u0644\u0628 . \n 8\u2022 \u0645\u0648\u0627\u0641\u0642\u0629 \u0627\u0644\u0623\u0637\u0631\u0627\u0641 . \n 9\u2022 \u0633\u062f\u0627\u062f \u0627\u0644\u0641\u0627\u062a\u0648\u0629 . 
\n 10\u2022 \u062a\u062d\u062f\u064a\u062b \u0627\u0644\u0648\u062b\u0627\u0626\u0642 (\u0646\u0638\u0627\u0645 \u0627\u0644\u0623\u0633\u0627\u0633/\u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) . \n 11\u2022\u0645\u0644\u0627\u062d\u0638\u0629 : \n 12\u2022\u0628\u0625\u0645\u0643\u0627\u0646 \u0627\u0644\u0639\u0645\u064a\u0644 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u062d\u0627\u0644\u0629 \u0627\u0644\u0637\u0644\u0628 ( \u0628\u0625\u0646\u062a\u0638\u0627\u0631 \u0627\u0644\u062a\u062d\u0642\u0642 \u0645\u0646 \u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a ) \u0628\u0625\u062a\u0628\u0627\u0639 \u0627\u0644\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062a\u0627\u0644\u064a\u0647 :", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672e"}
{"id": "03cfa28e4c27-2", "text": "13\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0645\u0646\u0635\u0629 \u0627\u0644\u0645\u0631\u0643\u0632 \u0627\u0644\u0633\u0639\u0648\u062f\u064a \u0644\u0644\u0623\u0639\u0645\u0627\u0644 (SBC). \n 14\u2022\u062a\u0633\u062c\u064a\u0644 \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0628\u0631 \u0627\u0644\u0646\u0641\u0627\u0630 \u0627\u0644\u0648\u0637\u0646\u064a \u0627\u0644\u0645\u0648\u062d\u062f \u0623\u0648 \u0627\u0644\u0628\u0631\u064a\u062f \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a . \n 15\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0637\u0644\u0628\u0627\u062a\u064a . \n 16\u2022\u0625\u0636\u0627\u0641\u0629 \u0631\u0642\u0645 \u0627\u0644\u0637\u0644\u0628 \u0641\u064a \u0625\u064a\u0642\u0648\u0646\u0629 (\u0628\u062d\u062b \u0645\u062a\u0642\u062f\u0645 ) \u0641\u064a \u0623\u0639\u0644\u0649 \u0627\u0644\u0635\u0641\u062d\u0629 . \n 17\u2022\u0628\u0639\u062f \u0638\u0647\u0648\u0631 \u0627\u0644\u0637\u0644\u0628 \u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0623\u064a\u0642\u0648\u0646\u0629 \u0625\u062c\u0631\u0627\u0621\u0627\u062a \u0648\u0627\u062e\u062a\u064a\u0627\u0631 ( \u062a\u0641\u0627\u0635\u064a\u0644 ) . \n 18\u2022\u0627\u0644\u0636\u063a\u0637 \u0639\u0644\u0649 \u0625\u0644\u063a\u0627\u0621 \u0627\u0644\u0637\u0644\u0628 .\n\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: \u0631\u0633\u0648\u0645 \u0646\u0634\u0631 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0639\u0642\u062f: \n 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a\u060c \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% . 
\n \u0631\u0633\u0648\u0645 \u0627\u0644\u062a\u0639\u062f\u064a\u0644 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a: \n 100 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a.\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a: \n https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672e"}
{"id": "03cfa28e4c27-3", "text": "https://business.sa/ServicesAndPrograms/ServicesDetails.html?ServiceID=140 \n * \u0639\u0646\u062f \u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629 \u062a\u0638\u0647\u0631 \u0644\u0644\u0639\u0645\u064a\u0644 \u0646\u0648\u0639 \u0627\u0644\u0643\u064a\u0627\u0646 \u0627\u0644\u0645\u0637\u0644\u0648\u0628 \u0639\u0646\u062f \u0627\u0644\u062a\u0623\u0633\u064a\u0633 ( \u0630\u0627\u062a \u0627\u0644\u0645\u0633\u0624\u0648\u0644\u064a\u0629 \u0627\u0644\u0645\u062d\u062f\u0648\u062f\u0629-\u0627\u0644\u062a\u0636\u0627\u0645\u0646-\u0627\u0644\u062a\u0648\u0635\u064a\u0629 \u0627\u0644\u0628\u0633\u064a\u0637\u0629- \u0627\u0644\u0645\u0647\u0646\u064a\u0629 )/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a05672e"}
{"id": "196a7f6e6203-0", "text": "\u0648\u0635\u0641 \u0627\u0644\u062e\u062f\u0645\u0629: \u062a\u0645\u0643\u0651\u0650\u0646 \u0647\u0630\u0647 \u0627\u0644\u062e\u062f\u0645\u0629 \u0627\u0644\u0639\u0645\u064a\u0644 \u0645\u0646 \u062a\u062d\u0648\u064a\u0644 ( \u0646\u0648\u0639 \u0627\u0644\u0645\u0646\u0634\u0623\u0629 ) \u0645\u0646 \u0634\u0631\u0643\u0629 \u0625\u0644\u0649 \u0645\u0624\u0633\u0633\u0629 .\n\u0634\u0631\u0648\u0637 \u0648\u0645\u062a\u0637\u0644\u0628\u0627\u062a \u0627\u0644\u062e\u062f\u0645\u0629: \u2022\u062a\u0642\u062f\u064a\u0645 \u0642\u0631\u0627\u0631 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0628\u0627\u0644\u062a\u062d\u0648\u064a\u0644 \u0639\u0646 \u0637\u0631\u064a\u0642 \u0646\u0638\u0627\u0645 \u0642\u0631\u0627\u0631\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0633\u0627\u0628\u0642 \u0642\u064a \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a \u0644\u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629 . \n \u2022\u0623\u0644\u0627 \u064a\u0642\u0644 \u0639\u0645\u0631 \u0645\u0642\u062f\u0645 \u0627\u0644\u0637\u0644\u0628 \u0639\u0646 18 \u0633\u0646\u0629. \n \u2022\u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0645\u0627\u0644\u0643 \u0627\u0644\u0645\u0624\u0633\u0633\u0629 \u0645\u0648\u0638\u0641 \u062d\u0643\u0648\u0645\u064a. \n \u2022\u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0644\u062f\u0649 \u0645\u0627\u0644\u0643 \u0627\u0644\u0645\u0624\u0633\u0633\u0629 \u0633\u062c\u0644 \u062a\u062c\u0627\u0631\u064a \u0642\u0627\u0626\u0645 (\u0645\u0624\u0633\u0633\u0629 \u0642\u0627\u0626\u0645\u0629). \n \u2022\u0623\u0644\u0627 \u064a\u0643\u0648\u0646 \u0639\u0644\u0649 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a \u0627\u0644\u0645\u0631\u0627\u062f \u062a\u062d\u0648\u064a\u0644\u0647 \u0623\u064a \u0625\u064a\u0642\u0627\u0641. 
\n \u2022\u0623\u0644\u0627\u064a\u0643\u0648\u0646 \u0627\u0644\u0633\u062c\u0644 \u0642\u0627\u0626\u0645 ( \u063a\u064a\u0631 \u0645\u0646\u062a\u0647\u064a ) . \n \u2022\u0639\u062f\u0645 \u0648\u062c\u0648\u062f \u0637\u0644\u0628\u0627\u062a \u0645\u0639\u0644\u0642\u0629 \u0639\u0644\u0649 \u0646\u0641\u0633 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a .", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056731"}
{"id": "196a7f6e6203-1", "text": "\u2022\u0641\u064a \u062d\u0627\u0644 \u0643\u0627\u0646\u062a \u0627\u0644\u0634\u0631\u0643\u0629 \u0645\u0633\u0627\u0647\u0645\u0629 \u0623\u0648 \u0645\u0633\u0627\u0647\u0645\u0629 \u0645\u0628\u0633\u0637\u0629 \u064a\u062a\u0645 \u062a\u0642\u062f\u064a\u0645 \u0627\u0644\u062e\u062f\u0645\u0629 \u0639\u0646 \u0637\u0631\u064a\u0642 \u0627\u0644\u0641\u0631\u0639 \u0627\u0644\u0631\u0642\u0645\u064a \u0645\u0639 \u062a\u0636\u0645\u064a\u0646 \u0625\u0631\u0641\u0627\u0642 \u0642\u0631\u0627\u0631 \u0627\u0644\u062c\u0645\u0639\u064a\u0629 \u0628\u0627\u0644\u062a\u062d\u0648\u064a\u0644 \u0628\u0639\u062f \u0646\u0634\u0631\u0647 \u0639\u0644\u0649 \u0646\u0638\u0627\u0645 \u0642\u0631\u0627\u0631 \u0627\u0644\u062c\u0645\u0639\u064a\u0627\u062a \u0627\u0644\u063a\u064a\u0631 \u0639\u0627\u062f\u064a\u0629.\n\u062e\u0637\u0648\u0627\u062a \u0627\u0644\u062d\u0635\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u062e\u062f\u0645\u0629: 1\u2022 \u0627\u0644\u062f\u062e\u0648\u0644 \u0625\u0644\u0649 \u0646\u0638\u0627\u0645 \u0642\u0631\u0627\u0631\u0627\u062a \u0627\u0644\u0634\u0631\u0643\u0627\u0621 \u0627\u0644\u0633\u0627\u0628\u0642 \u0642\u064a \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a \u0644\u0648\u0632\u0627\u0631\u0629 \u0627\u0644\u062a\u062c\u0627\u0631\u0629\u060c \u0648 \u062a\u0642\u062f\u064a\u0645 \u0637\u0644\u0628 \u062a\u062d\u0648\u064a\u0644\u060c \u062b\u0645 \u062a\u062a\u0645 \u062f\u0631\u0627\u0633\u0629 \u0627\u0644\u0637\u0644\u0628 \u0648 \u0627\u0639\u062a\u0645\u0627\u062f\u0647. \n 2\u2022\u0625\u0635\u062f\u0627\u0631 \u0641\u0627\u062a\u0648\u0631\u0629 \u0633\u062f\u0627\u062f . 
\n 3\u2022\u0633\u062f\u0627\u062f \u0627\u0644\u0631\u0633\u0648\u0645 \u0648 \u062a\u0648\u062b\u064a\u0642 \u0627\u0644\u0637\u0644\u0628 \u0644\u062f\u0649 \u0645\u0648\u0638\u0641 \u0627\u0644\u0648\u0632\u0627\u0631\u0629 \u0648\u0646\u0634\u0631 \u0627\u0644\u062a\u062d\u0648\u064a\u0644 \u0627\u0644\u0643\u062a\u0631\u0648\u0646\u064a\u064b\u0627. \n 4\u2022\u0627\u0644\u062f\u062e\u0648\u0644 \u0639\u0644\u0649 \u0627\u0644\u0641\u0631\u0639 \u0627\u0644\u0631\u0642\u0645\u064a \u0648 \u062a\u0639\u0628\u0626\u0629 \u0646\u0645\u0648\u0630\u062c \u0637\u0644\u0628 \u0625\u0635\u062f\u0627\u0631 \u0633\u062c\u0644 \u062a\u062c\u0627\u0631\u064a. \n 5\u2022\u0625\u0631\u0633\u0627\u0644 \u0627\u0644\u0637\u0644\u0628 \u0648 \u0627\u0639\u062a\u0645\u0627\u062f\u0647. \n 6\u2022\u0628\u0639\u062f \u0633\u062f\u0627\u062f \u0627\u0644\u0631\u0633\u0648\u0645 \u064a\u062a\u0645 \u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0633\u062c\u0644.", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056731"}
{"id": "196a7f6e6203-2", "text": "\u0631\u0633\u0648\u0645 \u0627\u0644\u062e\u062f\u0645\u0629: 1500 \u0631\u064a\u0627\u0644 \u0633\u0639\u0648\u062f\u064a \u0631\u0633\u0648\u0645 \u0627\u0644\u0646\u0634\u0631 \u064a\u0636\u0627\u0641 \u0625\u0644\u064a\u0647\u0627 \u0636\u0631\u064a\u0628\u0629 \u0627\u0644\u0642\u064a\u0645\u0629 \u0627\u0644\u0645\u0636\u0627\u0641\u0629 15% .\n\u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0645\u0639\u0644\u0648\u0645\u0627\u062a \u0623\u062e\u0631\u0649: \u0627\u0644\u0645\u0648\u0642\u0639 \u0627\u0644\u0625\u0644\u0643\u062a\u0631\u0648\u0646\u064a \u0644\u0644\u062e\u062f\u0645\u0629 : \n \u0642\u0631\u0627\u0631 \u0627\u0644\u0634\u0631\u0643\u0627\u0621 ( \u0644\u062a\u0642\u062f\u064a\u0645 \u0637\u0644\u0628 \u0627\u0644\u062a\u062d\u0648\u064a\u0644 ) \n https://mc.gov.sa/ar/eservices/Pages/ServiceDetails.aspx?sID=74 \n \u0627\u0644\u0641\u0631\u0639 \u0627\u0644\u0631\u0642\u0645\u064a ( \u0637\u0644\u0628 \u0625\u0635\u062f\u0627\u0631 \u0627\u0644\u0633\u062c\u0644 \u0627\u0644\u062a\u062c\u0627\u0631\u064a ) \n https://mc.gov.sa/ar/eservices/Pages/ServiceDetails.aspx?sID=29/n", "source": "https://kgate.bc.gov.sa/#/service/64087e96d8467ecd6a056731"} |
AntoineBlanot/snli-contrast | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: instruction
dtype: string
- name: label_name
dtype: string
splits:
- name: train
num_bytes: 283196540
num_examples: 1098734
- name: test
num_bytes: 5199496
num_examples: 19684
download_size: 23437414
dataset_size: 288396036
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "snli-contrast"
This dataset is the [snli-3way](https://huggingface.co/datasets/AntoineBlanot/snli-3way) dataset with an additional `instruction` feature.
This new feature, together with the related `label_name` feature, expresses how the `premise` and `hypothesis` features are related in the original dataset.
The following explains how the mapping is done:
### If the original example was of class `entailment`
Two data points are derived from that example.
One is the positive example (i.e., `label_name` == "positive"), which is assigned the following instruction: "The meaning of the hypothesis is logically inferred from the meaning of the premise."
The other is the negative example (i.e., `label_name` == "negative"), which is assigned the following instruction: "The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise."
### If the original example was of class `contradiction` or `neutral`
Two data points are derived from that example.
One is the positive example (i.e., `label_name` == "positive"), which is assigned the following instruction: "The meaning of the hypothesis either contradicts the meaning of the premise, is unrelated to it, or does not provide sufficient information to infer the meaning of the premise."
The other is the negative example (i.e., `label_name` == "negative"), which is assigned the following instruction: "The meaning of the hypothesis is logically inferred from the meaning of the premise."
This dataset is double the size of the original dataset because each original example is paired with both a positive and a negative instruction.
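The mapping described above can be sketched as follows. This is a minimal illustration, not the authors' actual preprocessing script; the original label column name (`label`) and its string values are assumptions based on the 3-way source dataset.

```python
# Instruction texts quoted from the card.
ENTAIL_INSTR = (
    "The meaning of the hypothesis is logically inferred from the meaning "
    "of the premise."
)
OTHER_INSTR = (
    "The meaning of the hypothesis either contradicts the meaning of the "
    "premise, is unrelated to it, or does not provide sufficient information "
    "to infer the meaning of the premise."
)

def expand(example):
    """Turn one 3-way NLI example into a positive and a negative contrast example."""
    if example["label"] == "entailment":
        pos_instr, neg_instr = ENTAIL_INSTR, OTHER_INSTR
    else:  # "contradiction" or "neutral"
        pos_instr, neg_instr = OTHER_INSTR, ENTAIL_INSTR
    base = {"premise": example["premise"], "hypothesis": example["hypothesis"]}
    return [
        {**base, "instruction": pos_instr, "label_name": "positive"},
        {**base, "instruction": neg_instr, "label_name": "negative"},
    ]
```

Applying `expand` to every source example yields the doubled dataset size noted above.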
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1 | ---
pretty_name: Evaluation run of augmxnt/shisa-base-7b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [augmxnt/shisa-base-7b-v1](https://huggingface.co/augmxnt/shisa-base-7b-v1) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T16:05:53.719253](https://huggingface.co/datasets/open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1/blob/main/results_2023-12-09T16-05-53.719253.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2521543269268176,\n\
\ \"acc_stderr\": 0.0301585291688336,\n \"acc_norm\": 0.24535569023474865,\n\
\ \"acc_norm_stderr\": 0.030748338390153722,\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006526,\n \"mc2\": 0.4239664190137454,\n\
\ \"mc2_stderr\": 0.014353789922903714\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47952218430034127,\n \"acc_stderr\": 0.014599131353035004,\n\
\ \"acc_norm\": 0.523037542662116,\n \"acc_norm_stderr\": 0.01459587320535827\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5813582951603267,\n\
\ \"acc_stderr\": 0.004923281841828519,\n \"acc_norm\": 0.7763393746265684,\n\
\ \"acc_norm_stderr\": 0.004158455808204937\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.015594753632006526,\n\
\ \"mc2\": 0.4239664190137454,\n \"mc2_stderr\": 0.014353789922903714\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.7853196527229677,\n\
\ \"acc_stderr\": 0.011539912734345402\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.35860500379075055,\n \"acc_stderr\": 0.013210317364134031\n\
\ }\n}\n```"
repo_url: https://huggingface.co/augmxnt/shisa-base-7b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|arc:challenge|25_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|gsm8k|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hellaswag|10_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-05-53.719253.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T16-05-53.719253.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- '**/details_harness|winogrande|5_2023-12-09T16-05-53.719253.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T16-05-53.719253.parquet'
- config_name: results
data_files:
- split: 2023_12_09T16_05_53.719253
path:
- results_2023-12-09T16-05-53.719253.parquet
- split: latest
path:
- results_2023-12-09T16-05-53.719253.parquet
---
# Dataset Card for Evaluation run of augmxnt/shisa-base-7b-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/augmxnt/shisa-base-7b-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [augmxnt/shisa-base-7b-v1](https://huggingface.co/augmxnt/shisa-base-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-09T16:05:53.719253](https://huggingface.co/datasets/open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1/blob/main/results_2023-12-09T16-05-53.719253.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2521543269268176,
"acc_stderr": 0.0301585291688336,
"acc_norm": 0.24535569023474865,
"acc_norm_stderr": 0.030748338390153722,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006526,
"mc2": 0.4239664190137454,
"mc2_stderr": 0.014353789922903714
},
"harness|arc:challenge|25": {
"acc": 0.47952218430034127,
"acc_stderr": 0.014599131353035004,
"acc_norm": 0.523037542662116,
"acc_norm_stderr": 0.01459587320535827
},
"harness|hellaswag|10": {
"acc": 0.5813582951603267,
"acc_stderr": 0.004923281841828519,
"acc_norm": 0.7763393746265684,
"acc_norm_stderr": 0.004158455808204937
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006526,
"mc2": 0.4239664190137454,
"mc2_stderr": 0.014353789922903714
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345402
},
"harness|gsm8k|5": {
"acc": 0.35860500379075055,
"acc_stderr": 0.013210317364134031
}
}
```
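To aggregate per-task scores like these yourself, one common approach is to macro-average the `acc` values across the `hendrycksTest` (MMLU) subtasks. A minimal sketch (the sample dict below is an illustrative subset of the JSON above, not the full results file):

```python
# Sketch: macro-average accuracy over the hendrycksTest (MMLU) subtasks
# from a results dict shaped like the JSON above. The sample dict is an
# illustrative subset, not the full results file.

sample_results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517},
    "harness|truthfulqa:mc|0": {"mc1": 0.2729498164014688},
}

# Keep only MMLU subtasks, identified by their task-name prefix
mmlu_accs = [
    scores["acc"]
    for task, scores in sample_results.items()
    if task.startswith("harness|hendrycksTest-")
]

# Unweighted (macro) average over the selected subtasks
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subtasks: {len(mmlu_accs)}, macro-average acc: {mmlu_macro_avg:.4f}")
```

The same pattern applies to `acc_norm` or any other metric key present in the per-task dicts.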
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-samsum-99725515-12535673 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP10
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP10
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
heliosprime/twitter_dataset_1713224556 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 29865
num_examples: 84
download_size: 23930
dataset_size: 29865
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713224556"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/beir_webis-touche2020 | ---
pretty_name: '`beir/webis-touche2020`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `beir/webis-touche2020`
The `beir/webis-touche2020` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/webis-touche2020).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=382,545
- `queries` (i.e., topics); count=49
- `qrels`: (relevance assessments); count=2,962
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/beir_webis-touche2020', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ..., 'title': ..., 'stance': ..., 'url': ...}
queries = load_dataset('irds/beir_webis-touche2020', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ..., 'description': ..., 'narrative': ...}
qrels = load_dataset('irds/beir_webis-touche2020', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
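Each `qrels` record maps a query to a judged document. As a minimal sketch (using made-up record values in the schema above, not actual Touché judgments), you could group the judgments into a per-query lookup for evaluation:

```python
from collections import defaultdict

def build_qrels_lookup(records):
    """Group relevance judgments by query_id, keeping only positive labels."""
    lookup = defaultdict(dict)
    for rec in records:
        if rec["relevance"] > 0:
            lookup[rec["query_id"]][rec["doc_id"]] = rec["relevance"]
    return dict(lookup)

# Hypothetical records following the qrels schema shown above.
sample = [
    {"query_id": "1", "doc_id": "d1", "relevance": 2, "iteration": "0"},
    {"query_id": "1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
    {"query_id": "2", "doc_id": "d3", "relevance": 1, "iteration": "0"},
]
print(build_qrels_lookup(sample))  # {'1': {'d1': 2}, '2': {'d3': 1}}
```

A lookup in this shape is what most IR evaluation utilities expect as their ground-truth input.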
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Bondarenko2020Tuche,
title={Overview of Touch{\'e} 2020: Argument Retrieval},
author={Alexander Bondarenko and Maik Fr{\"o}be and Meriem Beloucif and Lukas Gienapp and Yamen Ajjour and Alexander Panchenko and Christian Biemann and Benno Stein and Henning Wachsmuth and Martin Potthast and Matthias Hagen},
booktitle={CLEF},
year={2020}
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2 | ---
pretty_name: Evaluation run of proto-llm/uniwiz-7B-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [proto-llm/uniwiz-7B-v0.2](https://huggingface.co/proto-llm/uniwiz-7B-v0.2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-11T11:59:37.867165](https://huggingface.co/datasets/open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2/blob/main/results_2024-01-11T11-59-37.867165.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6349386109714982,\n\
\ \"acc_stderr\": 0.03234622354979693,\n \"acc_norm\": 0.6405443013835485,\n\
\ \"acc_norm_stderr\": 0.032995556354475986,\n \"mc1\": 0.42105263157894735,\n\
\ \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5991296817497732,\n\
\ \"mc2_stderr\": 0.01542474423315055\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946705,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104296\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6687910774746066,\n\
\ \"acc_stderr\": 0.0046968616254969234,\n \"acc_norm\": 0.8507269468233419,\n\
\ \"acc_norm_stderr\": 0.003556291232050351\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.0248708152510571,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.0248708152510571\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895525,\n \"\
acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"\
acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.01678548115920363,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.01678548115920363\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.0284588209914603,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.0284588209914603\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709225,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709225\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406943,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406943\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757433,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757433\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.016376966142610073,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.016376966142610073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.02456922360046085,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.02456922360046085\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.01268590653820624,\n\
\ \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.01268590653820624\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"\
acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42105263157894735,\n\
\ \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5991296817497732,\n\
\ \"mc2_stderr\": 0.01542474423315055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3752843062926459,\n \
\ \"acc_stderr\": 0.01333717054574293\n }\n}\n```"
repo_url: https://huggingface.co/proto-llm/uniwiz-7B-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|arc:challenge|25_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|gsm8k|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hellaswag|10_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T11-59-37.867165.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T11-59-37.867165.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- '**/details_harness|winogrande|5_2024-01-11T11-59-37.867165.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-11T11-59-37.867165.parquet'
- config_name: results
data_files:
- split: 2024_01_11T11_59_37.867165
path:
- results_2024-01-11T11-59-37.867165.parquet
- split: latest
path:
- results_2024-01-11T11-59-37.867165.parquet
---
# Dataset Card for Evaluation run of proto-llm/uniwiz-7B-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [proto-llm/uniwiz-7B-v0.2](https://huggingface.co/proto-llm/uniwiz-7B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2",
"harness_winogrande_5",
	split="latest")
```
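The timestamped splits described above encode the run's datetime (underscores replace the usual `:` and `-` separators). When a repo accumulates several runs, a quick way to pick the most recent split programmatically is to parse those names back into datetimes; the split names below are hypothetical examples following that convention:

```python
from datetime import datetime

# Hypothetical split names following the card's timestamp convention
split_names = ["2024_01_10T09_00_00.000000", "2024_01_11T11_59_37.867165"]

def parse_split(name):
    # Split names encode the run timestamp as %Y_%m_%dT%H_%M_%S.%f
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

latest = max(split_names, key=parse_split)  # most recent run
```

In practice the pre-defined "latest" split already points at the newest run, so this is only needed when comparing several historical runs side by side.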
## Latest results
These are the [latest results from run 2024-01-11T11:59:37.867165](https://huggingface.co/datasets/open-llm-leaderboard/details_proto-llm__uniwiz-7B-v0.2/blob/main/results_2024-01-11T11-59-37.867165.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6349386109714982,
"acc_stderr": 0.03234622354979693,
"acc_norm": 0.6405443013835485,
"acc_norm_stderr": 0.032995556354475986,
"mc1": 0.42105263157894735,
"mc1_stderr": 0.017283936248136487,
"mc2": 0.5991296817497732,
"mc2_stderr": 0.01542474423315055
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946705,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104296
},
"harness|hellaswag|10": {
"acc": 0.6687910774746066,
"acc_stderr": 0.0046968616254969234,
"acc_norm": 0.8507269468233419,
"acc_norm_stderr": 0.003556291232050351
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.0248708152510571,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.0248708152510571
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.01678548115920363,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.01678548115920363
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.034063153607115086,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.034063153607115086
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.0284588209914603,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.0284588209914603
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709225,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709225
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406943,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406943
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757433,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757433
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.016376966142610073,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.016376966142610073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.01268590653820624,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.01268590653820624
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42105263157894735,
"mc1_stderr": 0.017283936248136487,
"mc2": 0.5991296817497732,
"mc2_stderr": 0.01542474423315055
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.3752843062926459,
"acc_stderr": 0.01333717054574293
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dhivyeshrk/Disease-Symptom-Extensive-Clean | ---
license: mit
---
|
Mitsuki-Sakamoto/fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.1_seed_3_t_1.0_eval | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
- name: gen_proxy_reward
dtype: float64
- name: gen_gold_reward
dtype: float64
splits:
- name: epoch_0
num_bytes: 44049617
num_examples: 18928
- name: epoch_1
num_bytes: 44583814
num_examples: 18928
- name: epoch_10
num_bytes: 44732277
num_examples: 18928
- name: epoch_11
num_bytes: 44732198
num_examples: 18928
- name: epoch_12
num_bytes: 44733777
num_examples: 18928
- name: epoch_13
num_bytes: 44734635
num_examples: 18928
- name: epoch_14
num_bytes: 44734245
num_examples: 18928
- name: epoch_15
num_bytes: 44733656
num_examples: 18928
- name: epoch_16
num_bytes: 44733993
num_examples: 18928
- name: epoch_17
num_bytes: 44732843
num_examples: 18928
- name: epoch_18
num_bytes: 44734079
num_examples: 18928
- name: epoch_19
num_bytes: 44733062
num_examples: 18928
- name: epoch_2
num_bytes: 44666660
num_examples: 18928
- name: epoch_20
num_bytes: 44735024
num_examples: 18928
- name: epoch_21
num_bytes: 44734628
num_examples: 18928
- name: epoch_22
num_bytes: 44734949
num_examples: 18928
- name: epoch_23
num_bytes: 44735187
num_examples: 18928
- name: epoch_24
num_bytes: 44734333
num_examples: 18928
- name: epoch_25
num_bytes: 44733957
num_examples: 18928
- name: epoch_26
num_bytes: 44735261
num_examples: 18928
- name: epoch_27
num_bytes: 44736109
num_examples: 18928
- name: epoch_28
num_bytes: 44734562
num_examples: 18928
- name: epoch_29
num_bytes: 44734333
num_examples: 18928
- name: epoch_3
num_bytes: 44715690
num_examples: 18928
- name: epoch_4
num_bytes: 44737495
num_examples: 18928
- name: epoch_5
num_bytes: 44740546
num_examples: 18928
- name: epoch_6
num_bytes: 44740269
num_examples: 18928
- name: epoch_7
num_bytes: 44736127
num_examples: 18928
- name: epoch_8
num_bytes: 44735179
num_examples: 18928
- name: epoch_9
num_bytes: 44732950
num_examples: 18928
download_size: 709901766
dataset_size: 1341121455
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
---
# Dataset Card for "fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.1_seed_3_t_1.0_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_NoIdeaLand__test-3k-mx | ---
pretty_name: Evaluation run of NoIdeaLand/test-3k-mx
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NoIdeaLand/test-3k-mx](https://huggingface.co/NoIdeaLand/test-3k-mx) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 61 configurations, each one corresponding to one\
  \ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NoIdeaLand__test-3k-mx\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T02:20:18.679270](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-3k-mx/blob/main/results_2023-09-22T02-20-18.679270.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25937377998421673,\n\
\ \"acc_stderr\": 0.03158826848264918,\n \"acc_norm\": 0.263032034220802,\n\
\ \"acc_norm_stderr\": 0.031588822884227444,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.4093188700877857,\n\
\ \"mc2_stderr\": 0.014339231042407396\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3438566552901024,\n \"acc_stderr\": 0.013880644570156201,\n\
\ \"acc_norm\": 0.38054607508532423,\n \"acc_norm_stderr\": 0.014188277712349822\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48516231826329415,\n\
\ \"acc_stderr\": 0.004987583858923224,\n \"acc_norm\": 0.6643098984266083,\n\
\ \"acc_norm_stderr\": 0.004712660409846823\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.034065420585026505,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.034065420585026505\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241235,\n\
\ \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730445,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730445\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.029644006577009618,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.029644006577009618\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.035058596825972656,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.035058596825972656\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21428571428571427,\n \"acc_stderr\": 0.02113285918275445,\n \"\
acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02113285918275445\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276863,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276863\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2032258064516129,\n \"acc_stderr\": 0.022891687984554952,\n \"\
acc_norm\": 0.2032258064516129,\n \"acc_norm_stderr\": 0.022891687984554952\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.21182266009852216,\n \"acc_stderr\": 0.028748983689941065,\n \"\
acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.028748983689941065\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.031821550509166484,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.031821550509166484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28717948717948716,\n \"acc_stderr\": 0.022939925418530616,\n\
\ \"acc_norm\": 0.28717948717948716,\n \"acc_norm_stderr\": 0.022939925418530616\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.02720537153827947,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.02720537153827947\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1981651376146789,\n \"acc_stderr\": 0.017090573804217885,\n \"\
acc_norm\": 0.1981651376146789,\n \"acc_norm_stderr\": 0.017090573804217885\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1712962962962963,\n \"acc_stderr\": 0.02569534164382467,\n \"\
acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.02569534164382467\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3504273504273504,\n\
\ \"acc_stderr\": 0.03125610824421881,\n \"acc_norm\": 0.3504273504273504,\n\
\ \"acc_norm_stderr\": 0.03125610824421881\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\
\ \"acc_stderr\": 0.01569600856380709,\n \"acc_norm\": 0.26053639846743293,\n\
\ \"acc_norm_stderr\": 0.01569600856380709\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044273,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044273\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961455,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961455\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n\
\ \"acc_stderr\": 0.023839303311398222,\n \"acc_norm\": 0.2282958199356913,\n\
\ \"acc_norm_stderr\": 0.023839303311398222\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005716,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2770534550195567,\n\
\ \"acc_stderr\": 0.011430462443719676,\n \"acc_norm\": 0.2770534550195567,\n\
\ \"acc_norm_stderr\": 0.011430462443719676\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541104,\n\
\ \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541104\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.4093188700877857,\n\
\ \"mc2_stderr\": 0.014339231042407396\n }\n}\n```"
repo_url: https://huggingface.co/NoIdeaLand/test-3k-mx
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|arc:challenge|25_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hellaswag|10_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T02-20-18.679270.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T02-20-18.679270.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T02-20-18.679270.parquet'
- config_name: results
data_files:
- split: 2023_09_22T02_20_18.679270
path:
- results_2023-09-22T02-20-18.679270.parquet
- split: latest
path:
- results_2023-09-22T02-20-18.679270.parquet
---
# Dataset Card for Evaluation run of NoIdeaLand/test-3k-mx
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NoIdeaLand/test-3k-mx
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NoIdeaLand/test-3k-mx](https://huggingface.co/NoIdeaLand/test-3k-mx) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-3k-mx",
"harness_truthfulqa_mc_0",
	split="latest")
```
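Each evaluated task maps to its own config name in this repository. A minimal sketch of the naming convention — inferred from the config list above, not an official API — for building the config name of a given task:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build a details-repo config name from a harness task name.

    Convention inferred from the config list above: separator characters
    ("-", ":") become underscores, the name is prefixed with "harness_"
    and suffixed with the few-shot count.
    """
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{num_fewshot}"

print(harness_config_name("hendrycksTest-anatomy", 5))  # harness_hendrycksTest_anatomy_5
print(harness_config_name("truthfulqa:mc", 0))          # harness_truthfulqa_mc_0
```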
## Latest results
These are the [latest results from run 2023-09-22T02:20:18.679270](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-3k-mx/blob/main/results_2023-09-22T02-20-18.679270.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its own "latest" split):
```python
{
"all": {
"acc": 0.25937377998421673,
"acc_stderr": 0.03158826848264918,
"acc_norm": 0.263032034220802,
"acc_norm_stderr": 0.031588822884227444,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.4093188700877857,
"mc2_stderr": 0.014339231042407396
},
"harness|arc:challenge|25": {
"acc": 0.3438566552901024,
"acc_stderr": 0.013880644570156201,
"acc_norm": 0.38054607508532423,
"acc_norm_stderr": 0.014188277712349822
},
"harness|hellaswag|10": {
"acc": 0.48516231826329415,
"acc_stderr": 0.004987583858923224,
"acc_norm": 0.6643098984266083,
"acc_norm_stderr": 0.004712660409846823
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.034065420585026505,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.034065420585026505
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241235,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730445,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730445
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.029644006577009618,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.029644006577009618
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.035058596825972656,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.035058596825972656
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.037528339580033376,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.037528339580033376
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02113285918275445,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02113285918275445
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276863,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276863
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2032258064516129,
"acc_stderr": 0.022891687984554952,
"acc_norm": 0.2032258064516129,
"acc_norm_stderr": 0.022891687984554952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.028748983689941065,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.028748983689941065
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218977,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218977
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.031821550509166484,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.031821550509166484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28717948717948716,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.28717948717948716,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1981651376146789,
"acc_stderr": 0.017090573804217885,
"acc_norm": 0.1981651376146789,
"acc_norm_stderr": 0.017090573804217885
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.02569534164382467,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.02569534164382467
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3504273504273504,
"acc_stderr": 0.03125610824421881,
"acc_norm": 0.3504273504273504,
"acc_norm_stderr": 0.03125610824421881
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.01569600856380709,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.01569600856380709
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044273,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961455,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961455
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2282958199356913,
"acc_stderr": 0.023839303311398222,
"acc_norm": 0.2282958199356913,
"acc_norm_stderr": 0.023839303311398222
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2770534550195567,
"acc_stderr": 0.011430462443719676,
"acc_norm": 0.2770534550195567,
"acc_norm_stderr": 0.011430462443719676
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541104,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541104
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.4093188700877857,
"mc2_stderr": 0.014339231042407396
}
}
```
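The per-task accuracies above can be aggregated into a single MMLU score with a plain macro-average over the hendrycksTest subtasks. A minimal sketch, using three of the values above — this is an illustration, not the leaderboard's own aggregation code:

```python
# A few per-task entries copied from the results above; the full dict
# has one entry per hendrycksTest subtask.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.1925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.18421052631578946},
}

# Keep only the MMLU subtasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"Macro-average over {len(mmlu_accs)} subtasks: {macro_avg:.4f}")
```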
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
NovusResearch/gsm8k-Translated-TR | ---
license: mit
---
|
nateraw/spaces-monitoring | ---
license: mit
---
|
pkarypis/ultrachat_filtered_0.75 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: test_gen
num_bytes: 148276089
num_examples: 28304
- name: test_sft
num_bytes: 154695659
num_examples: 23110
- name: train_gen
num_bytes: 1347396812
num_examples: 256032
- name: train_sft
num_bytes: 1047788874.7576168
num_examples: 155898
download_size: 1410287614
dataset_size: 2698157434.757617
---
# Dataset Card for "ultrachat_filtered_0.75"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Back-up/chung-khoan-demo-p8-final | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 62381533
num_examples: 13674
download_size: 22154217
dataset_size: 62381533
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bikram22pi7/Thiruvalluvar_Thirukkural | ---
license: apache-2.0
task_categories:
- text-generation
- text-classification
language:
- ta
pretty_name: d
size_categories:
- 1K<n<10K
--- |
jelber2/RustBioGPT | ---
license: mit
---
```sh
git clone https://github.com/natir/br.git
git clone https://github.com/natir/pcon
git clone https://github.com/natir/yacrd
git clone https://github.com/natir/rasusa
git clone https://github.com/natir/fpa
git clone https://github.com/natir/kmrf
# Build the training CSV: one row per Rust source file
# (columns: repo_name, path, content, license)
rm -f RustBioGPT-train.csv
for i in `find . -name "*.rs"`; do
  paste -d "," \
    <(echo $i|perl -pe "s/\.\/(\w+)\/.+/\"\1\"/g") \
    <(echo $i|perl -pe "s/(.+)/\"\1\"/g") \
    <(perl -pe "s/\n/\\\n/g" $i|perl -pe s"/\"/\'/g" |perl -pe "s/(.+)/\"\1\"/g") \
    <(echo "mit"|perl -pe "s/(.+)/\"\1\"/g") \
    >> RustBioGPT-train.csv
done
# Prepend the CSV header row
sed -i '1i "repo_name","path","content","license"' RustBioGPT-train.csv
``` |
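The resulting CSV escapes newlines in file contents as a literal `\n` and replaces embedded double quotes with single quotes, so each source file fits on one physical CSV row. A sketch of reading it back with Python's `csv` module (the sample row below is hypothetical):

```python
import csv
import io

# A hypothetical CSV in the format produced by the pipeline above:
# newlines escaped as literal "\n", one row per Rust source file.
sample = '"repo_name","path","content","license"\n' \
         '"yacrd","./yacrd/src/main.rs","fn main() {}\\n","mit"\n'

rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    # Undo the newline escaping applied by the perl step
    source = row["content"].replace("\\n", "\n")
    print(row["repo_name"], row["path"], len(source))
```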
liuyanchen1015/MULTI_VALUE_wnli_flat_adj_for_adv | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 203
num_examples: 1
- name: test
num_bytes: 1047
num_examples: 4
- name: train
num_bytes: 1705
num_examples: 6
download_size: 10769
dataset_size: 2955
---
# Dataset Card for "MULTI_VALUE_wnli_flat_adj_for_adv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lansinuote/diffusion.8.instruct_pix2pix | ---
dataset_info:
features:
- name: input
dtype: image
- name: text
dtype: string
- name: output
dtype: image
splits:
- name: train
num_bytes: 416880509.0
num_examples: 1000
download_size: 416898966
dataset_size: 416880509.0
---
# Dataset Card for "diffusion.8.instruct_pix2pix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andersonbcdefg/combined_triples_with_margins | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: source
dtype: string
- name: qp_sim
dtype: float32
- name: qn_sim
dtype: float32
- name: pn_sim
dtype: float32
splits:
- name: train
num_bytes: 1589466086
num_examples: 1562437
download_size: 965469871
dataset_size: 1589466086
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ivrit-ai/audio-transcripts | ---
language:
- he
license: other
size_categories:
- 1M<n<10M
task_categories:
- audio-classification
- voice-activity-detection
extra_gated_prompt: 'You agree to the following license terms:
This material and data is licensed under the terms of the Creative Commons Attribution
4.0 International License (CC BY 4.0), The full text of the CC-BY 4.0 license is
available at https://creativecommons.org/licenses/by/4.0/.
Notwithstanding the foregoing, this material and data may only be used, modified
and distributed for the express purpose of training AI models, and subject to the
foregoing restriction. In addition, this material and data may not be used in order
to create audiovisual material that simulates the voice or likeness of the specific
individuals appearing or speaking in such materials and data (a “deep-fake”). To
the extent this paragraph is inconsistent with the CC-BY-4.0 license, the terms
of this paragraph shall govern.
By downloading or using any of this material or data, you agree that the Project
makes no representations or warranties in respect of the data, and shall have no
liability in respect thereof. These disclaimers and limitations are in addition
to any disclaimers and limitations set forth in the CC-BY-4.0 license itself. You
understand that the project is only able to make available the materials and data
pursuant to these disclaimers and limitations, and without such disclaimers and
limitations the project would not be able to make available the materials and data
for your use.'
extra_gated_fields:
I have read the license, and agree to its terms: checkbox
dataset_info:
features:
- name: source
dtype: string
- name: episode
dtype: string
- name: uuid
dtype: string
- name: text
dtype: string
- name: attrs
struct:
- name: segments
list:
- name: avg_logprob
dtype: float64
- name: compression_ratio
dtype: float64
- name: end
dtype: float64
- name: id
dtype: int64
- name: no_speech_prob
dtype: float64
- name: seek
dtype: int64
- name: start
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1290457176
num_examples: 2183042
download_size: 421521923
dataset_size: 1290457176
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
ivrit.ai is a database of Hebrew audio and text content.
**audio-base** contains the raw, unprocessed sources.
**audio-vad** contains audio snippets generated by applying Silero VAD (https://github.com/snakers4/silero-vad) to the base dataset.
**audio-transcripts** contains transcriptions for each snippet in the audio-vad dataset.
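Each **audio-transcripts** record pairs a snippet with its transcription and the raw Whisper segment attributes (the `attrs.segments` fields in the schema above). A minimal sketch of reassembling a snippet's text from its segments; the record below is hypothetical, with field names taken from the schema:

```python
# A hypothetical audio-transcripts record, following the schema above
record = {
    "source": "Geekonomy",
    "episode": "episode-001",
    "uuid": "0000-0000",
    "text": "",
    "attrs": {
        "segments": [
            {"id": 0, "start": 0.0, "end": 2.5, "text": "shalom ", "avg_logprob": -0.2,
             "compression_ratio": 1.1, "no_speech_prob": 0.01, "seek": 0},
            {"id": 1, "start": 2.5, "end": 4.0, "text": "olam", "avg_logprob": -0.3,
             "compression_ratio": 1.0, "no_speech_prob": 0.02, "seek": 0},
        ]
    },
}

# Concatenate segment texts, skipping segments Whisper judged likely non-speech
segments = record["attrs"]["segments"]
full_text = "".join(s["text"] for s in segments if s["no_speech_prob"] < 0.5)
duration = segments[-1]["end"] - segments[0]["start"]
print(full_text.strip(), duration)
```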
The audio-base dataset contains data from the following sources:
* Geekonomy (Podcast, https://geekonomy.net)
* HaCongress (Podcast, https://hacongress.podbean.com/)
* Idan Eretz's YouTube channel (https://www.youtube.com/@IdanEretz)
* Moneytime (Podcast, https://money-time.co.il)
* Mor'e Nevohim (Podcast, https://open.spotify.com/show/1TZeexEk7n60LT1SlS2FE2?si=937266e631064a3c)
* Yozevitch's World (Podcast, https://www.yozevitch.com/yozevitch-podcast)
* NETfrix (Podcast, https://netfrix.podbean.com)
* On Meaning (Podcast, https://mashmaut.buzzsprout.com)
* Shnekel (Podcast, https://www.shnekel.live)
* Bite-sized History (Podcast, https://soundcloud.com/historia-il)
* Tziun 3 (Podcast, https://tziun3.co.il)
* Academia Israel (https://www.youtube.com/@academiaisrael6115)
* Shiluv Maagal (https://www.youtube.com/@ShiluvMaagal)
Paper: https://arxiv.org/abs/2307.08720
If you use our datasets, please cite the following:
```
@misc{marmor2023ivritai,
title={ivrit.ai: A Comprehensive Dataset of Hebrew Speech for AI Research and Development},
author={Yanir Marmor and Kinneret Misgav and Yair Lifshitz},
year={2023},
eprint={2307.08720},
archivePrefix={arXiv},
primaryClass={eess.AS}
}
``` |
mayankparkar/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cellophaneZR/AE | ---
license: cc-by-nc-4.0
---
|
emirsoyturk/ethereum-vulnerability-dataset | ---
license: mit
---
|
SlookUP/ChatLawAll | ---
license: openrail
---
|
SocialGrep/reddit-crypto-aug-2021 | ---
annotations_creators:
- lexyr
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
paperswithcode_id: null
---
# Dataset Card for reddit-crypto-aug-2021
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://socialgrep.com/datasets](https://socialgrep.com/datasets?utm_source=huggingface&utm_medium=link&utm_campaign=dataset&utm_term=crypto)
- **Point of Contact:** [Website](https://socialgrep.com/contact?utm_source=huggingface&utm_medium=link&utm_campaign=dataset&utm_term=crypto)
### Dataset Summary
This corpus contains the complete data for the activity on the following subreddits for the entire month of August 2021:
- /r/cryptocurrency
- /r/cryptocurrencyclassic
- /r/cryptocurrencyico
- /r/cryptomars
- /r/cryptomoon
- /r/cryptomoonshots
- /r/satoshistreetbets
### Languages
Mainly English.
## Dataset Structure
### Data Instances
A data point is a post or a comment. Because the two are stored separately, they exist in two different files, even though many fields are shared.
### Data Fields
- 'type': the type of the data point. Can be 'post' or 'comment'.
- 'id': the base-36 Reddit ID of the data point. Unique when combined with type.
- 'subreddit.id': the base-36 Reddit ID of the data point's host subreddit. Unique.
- 'subreddit.name': the human-readable name of the data point's host subreddit.
- 'subreddit.nsfw': a boolean marking the data point's host subreddit as NSFW or not.
- 'created_utc': a UTC timestamp for the data point.
- 'permalink': a reference link to the data point on Reddit.
- 'score': score of the data point on Reddit.
- 'domain': (Post only) the domain of the data point's link.
- 'url': (Post only) the destination of the data point's link, if any.
- 'selftext': (Post only) the self-text of the data point, if any.
- 'title': (Post only) the title of the post data point.
- 'body': (Comment only) the body of the comment data point.
- 'sentiment': (Comment only) the result of an in-house sentiment analysis pipeline. Used for exploratory analysis.
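A sketch of filtering data points on the fields above. The records and sentiment values here are illustrative; in practice they would come from the dataset's files:

```python
from datetime import datetime, timezone

# Hypothetical data points shaped like the fields described above
points = [
    {"type": "comment", "id": "h1abc", "subreddit.name": "cryptocurrency",
     "created_utc": 1627776000, "score": 12, "body": "to the moon", "sentiment": 0.8},
    {"type": "post", "id": "p9xyz", "subreddit.name": "satoshistreetbets",
     "created_utc": 1627862400, "score": 3, "title": "DD thread", "selftext": ""},
]

# Keep only positive-sentiment comments (sentiment is a comment-only field)
positive_comments = [
    p for p in points
    if p["type"] == "comment" and p.get("sentiment", 0) > 0
]

for p in positive_comments:
    # created_utc is a UTC timestamp
    when = datetime.fromtimestamp(p["created_utc"], tz=timezone.utc)
    print(when.date(), p["subreddit.name"], p["score"])
```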
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
CC-BY v4.0
### Contributions
[Needs More Information] |
JLD/unsplash25k-image-embeddings | ---
license: mit
---
|
Kauasido/PACKMCDALESTE | ---
license: openrail
---
|
mask-distilled-one-sec-cv12/chunk_228 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1176735740
num_examples: 231095
download_size: 1197667160
dataset_size: 1176735740
---
# Dataset Card for "chunk_228"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rocinante/tulu_merge | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: data_source
dtype: string
- name: history
sequence:
sequence: string
splits:
- name: train
num_bytes: 306750727
num_examples: 203886
download_size: 174953486
dataset_size: 306750727
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tulu_merge"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-51000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1113308
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Morteza-Shahrabi-Farahani/Detecting-toxic-comments | ---
license: mit
---
|
minoosh/shEMO_nosplits | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: emotion
dtype:
class_label:
names:
'0': A
'1': H
'2': N
'3': S
'4': W
'5': F
splits:
- name: train
num_bytes: 1063025462.0
num_examples: 3000
download_size: 1043899084
dataset_size: 1063025462.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "shEMO_nosplits"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
detectors/lsun_r-ood | ---
license: unknown
size_categories:
- 10K<n<100K
task_categories:
- image-classification
paperswithcode_id: lsun
pretty_name: LSUN (r)
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 27566116.0
num_examples: 10000
download_size: 0
dataset_size: 27566116.0
---
# Dataset Card for LSUN (r) for OOD Detection
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Original Dataset Authors**: Limin Wang, Sheng Guo, Weilin Huang, Yuanjun Xiong, Yu Qiao
- **OOD Split Authors:** Shiyu Liang, Yixuan Li, R. Srikant
- **Shared by:** Eduardo Dadalto
- **License:** unknown
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Original Dataset Paper:** http://arxiv.org/abs/1610.01119v2
- **First OOD Application Paper:** http://arxiv.org/abs/1706.02690v5
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset is intended to be used as an out-of-distribution dataset for image classification benchmarks.
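A common baseline for such benchmarks scores each input by its maximum softmax probability (MSP) and flags low-scoring inputs as out-of-distribution. A minimal, library-free sketch (the logits and threshold below are illustrative and not taken from the `detectors` API):

```python
import math

def max_softmax_probability(logits):
    """Maximum softmax probability: a standard OOD-detection baseline score."""
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    return max(exps) / sum(exps)

# Illustrative logits: a confident in-distribution prediction vs. a flat one
in_dist_logits = [8.0, 0.5, 0.1]
ood_logits = [1.1, 1.0, 0.9]

threshold = 0.5  # illustrative; in practice tuned on held-out data
for name, logits in [("in-dist", in_dist_logits), ("ood", ood_logits)]:
    score = max_softmax_probability(logits)
    print(name, round(score, 3), "OOD" if score < threshold else "ID")
```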
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
This dataset is not annotated.
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The goal of curating and sharing this dataset on the Hugging Face Hub is to accelerate research and promote reproducibility in generalized out-of-distribution (OOD) detection.
Check out the Python library [detectors](https://github.com/edadaltocg/detectors) if you are interested in OOD detection.
### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
Please check original paper for details on the dataset.
### Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Please check original paper for details on the dataset.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```bibtex
@software{detectors2023,
author = {Eduardo Dadalto},
title = {Detectors: a Python Library for Generalized Out-Of-Distribution Detection},
url = {https://github.com/edadaltocg/detectors},
doi = {https://doi.org/10.5281/zenodo.7883596},
month = {5},
year = {2023}
}
@article{1706.02690v5,
author = {Shiyu Liang and Yixuan Li and R. Srikant},
title = {Enhancing The Reliability of Out-of-distribution Image Detection in
Neural Networks},
year = {2017},
month = {6},
note = {ICLR 2018},
archiveprefix = {arXiv},
url = {http://arxiv.org/abs/1706.02690v5}
}
@article{1610.01119v2,
author = {Limin Wang and Sheng Guo and Weilin Huang and Yuanjun Xiong and Yu Qiao},
title = {Knowledge Guided Disambiguation for Large-Scale Scene Classification
with Multi-Resolution CNNs},
year = {2016},
month = {10},
note = {To appear in IEEE Transactions on Image Processing. Code and models
are available at https://github.com/wanglimin/MRCNN-Scene-Recognition},
archiveprefix = {arXiv},
url = {http://arxiv.org/abs/1610.01119v2}
}
```
## Dataset Card Authors
Eduardo Dadalto
## Dataset Card Contact
https://huggingface.co/edadaltocg |
KunalEsM/combine_data | ---
license: apache-2.0
---
|
bn22/dolphincoder-25k-mini | ---
license: apache-2.0
---
|
AdapterOcean/code_instructions_standardized_cluster_7_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 21709928
num_examples: 17300
download_size: 10819179
dataset_size: 21709928
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_7_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kheopss/prompt_dataset_v2_hermes | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: system
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 8342523
num_examples: 1960
download_size: 2041735
dataset_size: 8342523
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hrtdind/saturn-selfplay | ---
license: mit
---
|
cestwc/concise536 | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: int64
- name: cite
dtype: string
- name: wordy
dtype: string
- name: concise
sequence: string
- name: category
dtype: string
- name: link
dtype: string
- name: delete
dtype:
class_label:
names:
'0': not required
'1': required
- name: replace
dtype:
class_label:
names:
'0': not required
'1': required
- name: rewrite
dtype:
class_label:
names:
'0': not required
'1': required
splits:
- name: validation
num_bytes: 3692
num_examples: 14
- name: test
num_bytes: 161635
num_examples: 536
download_size: 79866
dataset_size: 165327
---
# Dataset Card for "concise536"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_L-R__LLmRa-1.3B | ---
pretty_name: Evaluation run of L-R/LLmRa-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [L-R/LLmRa-1.3B](https://huggingface.co/L-R/LLmRa-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_L-R__LLmRa-1.3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T07:53:28.582746](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-1.3B/blob/main/results_2023-10-23T07-53-28.582746.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.015310402684563759,\n\
\ \"em_stderr\": 0.0012574265699578076,\n \"f1\": 0.07718435402684563,\n\
\ \"f1_stderr\": 0.0019076539585540348,\n \"acc\": 0.29556455256278075,\n\
\ \"acc_stderr\": 0.0072895996116890014\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.015310402684563759,\n \"em_stderr\": 0.0012574265699578076,\n\
\ \"f1\": 0.07718435402684563,\n \"f1_stderr\": 0.0019076539585540348\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225267\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.590370955011839,\n \"acc_stderr\": 0.013821049109655476\n\
\ }\n}\n```"
repo_url: https://huggingface.co/L-R/LLmRa-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T07_53_28.582746
path:
- '**/details_harness|drop|3_2023-10-23T07-53-28.582746.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T07-53-28.582746.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T07_53_28.582746
path:
- '**/details_harness|gsm8k|5_2023-10-23T07-53-28.582746.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T07-53-28.582746.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-12-55.866010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-12-55.866010.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T00-12-55.866010.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T07_53_28.582746
path:
- '**/details_harness|winogrande|5_2023-10-23T07-53-28.582746.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T07-53-28.582746.parquet'
- config_name: results
data_files:
- split: 2023_10_04T00_12_55.866010
path:
- results_2023-10-04T00-12-55.866010.parquet
- split: 2023_10_23T07_53_28.582746
path:
- results_2023-10-23T07-53-28.582746.parquet
- split: latest
path:
- results_2023-10-23T07-53-28.582746.parquet
---
# Dataset Card for Evaluation run of L-R/LLmRa-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/L-R/LLmRa-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [L-R/LLmRa-1.3B](https://huggingface.co/L-R/LLmRa-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_L-R__LLmRa-1.3B",
"harness_winogrande_5",
	split="latest")
```
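As a side note, the details repository name used above can be derived from the model id. This is a minimal sketch assuming the naming pattern visible in this card (`details_` prefix, with `/` in the model id replaced by `__`) holds generally:

```python
# Sketch (assumption): details repos follow the pattern seen in this card,
# where "/" in the model id becomes "__" after a "details_" prefix.
model_id = "L-R/LLmRa-1.3B"
details_repo = f"open-llm-leaderboard/details_{model_id.replace('/', '__')}"
print(details_repo)  # open-llm-leaderboard/details_L-R__LLmRa-1.3B
```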
## Latest results
These are the [latest results from run 2023-10-23T07:53:28.582746](https://huggingface.co/datasets/open-llm-leaderboard/details_L-R__LLmRa-1.3B/blob/main/results_2023-10-23T07-53-28.582746.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.015310402684563759,
"em_stderr": 0.0012574265699578076,
"f1": 0.07718435402684563,
"f1_stderr": 0.0019076539585540348,
"acc": 0.29556455256278075,
"acc_stderr": 0.0072895996116890014
},
"harness|drop|3": {
"em": 0.015310402684563759,
"em_stderr": 0.0012574265699578076,
"f1": 0.07718435402684563,
"f1_stderr": 0.0019076539585540348
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225267
},
"harness|winogrande|5": {
"acc": 0.590370955011839,
"acc_stderr": 0.013821049109655476
}
}
```
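The latest-results JSON above is a plain nested dictionary keyed by task name, so individual metrics can be read directly once it is loaded. A minimal sketch, using a subset of the values shown above:

```python
# Minimal sketch: reading a per-task metric from the latest-results dict.
# The values below are copied from the JSON shown in this card.
results = {
    "all": {"acc": 0.29556455256278075, "acc_stderr": 0.0072895996116890014},
    "harness|winogrande|5": {"acc": 0.590370955011839, "acc_stderr": 0.013821049109655476},
}
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(round(winogrande_acc, 3))  # 0.59
```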
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
channotte/Georges_Sand | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2455525
num_examples: 42086
- name: test
num_bytes: 258701
num_examples: 5286
download_size: 1777131
dataset_size: 2714226
---
# Dataset Card for "Georges_Sand"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_beberik__Nyxene-11B | ---
pretty_name: Evaluation run of beberik/Nyxene-11B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beberik/Nyxene-11B](https://huggingface.co/beberik/Nyxene-11B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beberik__Nyxene-11B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T17:29:47.826048](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-11B/blob/main/results_2023-12-04T17-29-47.826048.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651058700269482,\n\
\ \"acc_stderr\": 0.0320152029210293,\n \"acc_norm\": 0.6547280296636943,\n\
\ \"acc_norm_stderr\": 0.03264773034799592,\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5749990941717074,\n\
\ \"mc2_stderr\": 0.015569738564249067\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620199,\n\
\ \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.013592431519068075\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6625174268074089,\n\
\ \"acc_stderr\": 0.004718846448021786,\n \"acc_norm\": 0.8454491137223661,\n\
\ \"acc_norm_stderr\": 0.0036073726062951024\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544074,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544074\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305528,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305528\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\
\ \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n\
\ \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\
\ \"acc_stderr\": 0.016232826818678492,\n \"acc_norm\": 0.37988826815642457,\n\
\ \"acc_norm_stderr\": 0.016232826818678492\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135118,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135118\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504515,\n\
\ \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504515\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740546,\n \"\
acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740546\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5749990941717074,\n\
\ \"mc2_stderr\": 0.015569738564249067\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881575\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5178165276724791,\n \
\ \"acc_stderr\": 0.01376373837986793\n }\n}\n```"
repo_url: https://huggingface.co/beberik/Nyxene-11B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-29-47.826048.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T17-29-47.826048.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- '**/details_harness|winogrande|5_2023-12-04T17-29-47.826048.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T17-29-47.826048.parquet'
- config_name: results
data_files:
- split: 2023_12_04T17_29_47.826048
path:
- results_2023-12-04T17-29-47.826048.parquet
- split: latest
path:
- results_2023-12-04T17-29-47.826048.parquet
---
# Dataset Card for Evaluation run of beberik/Nyxene-11B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beberik/Nyxene-11B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [beberik/Nyxene-11B](https://huggingface.co/beberik/Nyxene-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset(
    "open-llm-leaderboard/details_beberik__Nyxene-11B",
    "harness_winogrande_5",
    split="train",
)
```
## Latest results
These are the [latest results from run 2023-12-04T17:29:47.826048](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-11B/blob/main/results_2023-12-04T17-29-47.826048.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.651058700269482,
"acc_stderr": 0.0320152029210293,
"acc_norm": 0.6547280296636943,
"acc_norm_stderr": 0.03264773034799592,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5749990941717074,
"mc2_stderr": 0.015569738564249067
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620199,
"acc_norm": 0.6834470989761092,
"acc_norm_stderr": 0.013592431519068075
},
"harness|hellaswag|10": {
"acc": 0.6625174268074089,
"acc_stderr": 0.004718846448021786,
"acc_norm": 0.8454491137223661,
"acc_norm_stderr": 0.0036073726062951024
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544074,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544074
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305528,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305528
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678492,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678492
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135118,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135118
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504515,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740546,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740546
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5749990941717074,
"mc2_stderr": 0.015569738564249067
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881575
},
"harness|gsm8k|5": {
"acc": 0.5178165276724791,
"acc_stderr": 0.01376373837986793
}
}
```
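The aggregate scores in the `"all"` block above can be reproduced, at least approximately, by averaging the per-task scores. A minimal sketch using three of the entries from the results JSON (whether the leaderboard applies any per-task weighting is an assumption to verify against its documentation):

```python
# Per-task acc_norm values copied from the results JSON above (a small subset).
per_task_acc_norm = {
    "harness|arc:challenge|25": 0.6834470989761092,
    "harness|hellaswag|10": 0.8454491137223661,
    "harness|hendrycksTest-abstract_algebra|5": 0.32,
}

# Unweighted mean over the selected tasks.
average = sum(per_task_acc_norm.values()) / len(per_task_acc_norm)
print(f"acc_norm over {len(per_task_acc_norm)} tasks: {average:.4f}")
```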
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-public_relations-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 9150
num_examples: 17
download_size: 13091
dataset_size: 9150
---
# Dataset Card for "mmlu-public_relations-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indolem/IndoMMLU | ---
license: mit
task_categories:
- question-answering
language:
- id
tags:
- knowledge
pretty_name: IndoMMLU
size_categories:
- 10K<n<100K
---
# IndoMMLU
<!---
[](https://github.com/internLM/OpenCompass/) [](https://github.com/EleutherAI/lm-evaluation-harness)
-->
<p align="center"> <img src="https://raw.githubusercontent.com/fajri91/eval_picts/master/IndoMMLU-Bar.png" style="width: 100%;" id="title-icon">
</p>
<p align="center"> <a href="http://www.fajrikoto.com" target="_blank">Fajri Koto</a>, <a href="https://www.linkedin.com/in/nuaisyah/" target="_blank">Nurul Aisyah</a>, <a href="https://haonan-li.github.io/" target="_blank">Haonan Li</a>, <a href="https://people.eng.unimelb.edu.au/tbaldwin/" target="_blank">Timothy Baldwin</a> </p>
<h4 align="center">
<p align="center" style="display: flex; flex-direction: row; justify-content: center; align-items: center">
📄 <a href="https://arxiv.org/abs/2310.04928" target="_blank" style="margin-right: 15px; margin-left: 10px">Paper</a> •
🏆 <a href="https://github.com/fajri91/IndoMMLU/blob/main/README_EN.md#evaluation" target="_blank" style="margin-left: 10px">Leaderboard</a> •
🤗 <a href="https://huggingface.co/datasets/indolem/indommlu" target="_blank" style="margin-left: 10px">Dataset</a>
</p>
</h4>
## Introduction
We introduce IndoMMLU, the first multi-task language understanding benchmark for Indonesian culture and languages,
which consists of questions from primary school to university entrance exams in Indonesia. By employing professional teachers,
we obtain 14,906 questions across 63 tasks and education levels, with 46% of the questions focusing on assessing proficiency
in the Indonesian language and knowledge of nine local languages and cultures in Indonesia.
<p align="left"> <img src="https://github.com/fajri91/eval_picts/blob/master/IndoMMLU-dist.png?raw=true" style="width: 500px;" id="title-icon"> </p>
## Subjects
| Level | Subjects |
|-----------|------------------------------------|
| SD (Primary School) | Science, Social science, Civics, Indonesian Language, Balinese, Makassarese, Banjarese, Lampungic, Madurese, Sundanese, Javanese, Dayak Ngaju, Minangkabau culture, Art, Sports, Islam religion, Christian religion, Hindu religion |
| SMP (Junior High School) | Science, Social science, Civics, Indonesian Language, Balinese, Makassarese, Banjarese, Lampungic, Madurese, Sundanese, Javanese, Minangkabau culture, Art, Sports, Islam religion, Christian religion, Hindu religion |
| SMA (Senior High School) | Physics, Chemistry, Biology, Geography, Sociology, Economics, History, Civics, Indonesian Language, Balinese, Makassarese, Banjarese, Lampungic, Madurese, Sundanese, Javanese, Art, Sports, Islam religion, Christian religion, Hindu religion |
| University Entrance Test | Chemistry, Biology, Geography, Sociology, Economics, History, Indonesian Language |
We categorize the collected questions into different subject areas, including: (1) STEM (Science, Technology, Engineering, and Mathematics); (2) Social Science; (3) Humanities; (4) Indonesian Language; and (5) Local Languages and Cultures.
## Examples
These questions are written in Indonesian. For local language subjects, some are written in the local languages. The English version is for illustrative purposes only.
<p align="left">
<img src="https://github.com/fajri91/eval_picts/blob/master/min_example.png?raw=true" style="width: 400px;" id="title-icon">
</p>
## Evaluation
We evaluate 24 multilingual LLMs of different sizes in zero-shot and few-shot settings. This includes [GPT-3.5 (ChatGPT)](https://chat.openai.com/), [XGLM](https://arxiv.org/abs/2112.10668), [Falcon](https://falconllm.tii.ae/), [BLOOMZ](https://huggingface.co/bigscience/bloomz), [mT0](https://huggingface.co/bigscience/mt0-xxl), [LLaMA](https://arxiv.org/abs/2302.13971), and [Bactrian-X](https://github.com/mbzuai-nlp/bactrian-x). Prior to the question and multiple-choice options, we add a simple prompt in the Indonesian language:
```
Ini adalah soal [subject] untuk [level]. Pilihlah salah satu jawaban yang dianggap benar!
English Translation: This is a [subject] question for [level]. Please choose the correct answer!
```
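Assembling this prompt for a single question can be sketched as follows (the field names and the trailing `Jawaban:` cue are illustrative assumptions, not necessarily the exact format used in `evaluate.py`):

```python
def build_prompt(subject: str, level: str, question: str, choices: list[str]) -> str:
    """Format one IndoMMLU question with the Indonesian zero-shot prompt."""
    header = (
        f"Ini adalah soal {subject} untuk {level}. "
        "Pilihlah salah satu jawaban yang dianggap benar!"
    )
    # Questions have up to five options, labeled A-E.
    options = "\n".join(
        f"{letter}. {text}" for letter, text in zip("ABCDE", choices)
    )
    return f"{header}\n\n{question}\n{options}\nJawaban:"

prompt = build_prompt(
    "IPA", "SD kelas 6",
    "Planet terdekat dari matahari adalah ...",
    ["Bumi", "Mars", "Merkurius", "Venus"],
)
print(prompt)
```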
#### Zero-shot Evaluation
| Model (#param) | STEM | Social Science | Humanities | Indonesian Lang. | Local L. Culture | Average |
|---------------------|------|----------|-------------|---------|----------|---------|
| Random | 21.9 | 23.4 | 23.5 | 24.4 | 26.6 | 24.4 |
| [GPT-3.5 (175B)](https://chat.openai.com/) | **54.3** | **62.5** | **64.0** | **62.2** | 39.3 | **53.2** |
| [XGLM (564M)](https://huggingface.co/facebook/xglm-564M) | 22.1 | 23.0 | 25.6 | 25.6 | 27.5 | 25.2 |
| [XGLM (1.7B)](https://huggingface.co/facebook/xglm-1.7B) | 20.9 | 23.0 | 24.6 | 24.8 | 26.6 | 24.4 |
| [XGLM (2.9B)](https://huggingface.co/facebook/xglm-2.9B) | 22.9 | 23.2 | 25.4 | 26.3 | 27.2 | 25.2 |
| [XGLM (4.5B)](https://huggingface.co/facebook/xglm-4.5B) | 21.8 | 23.1 | 25.6 | 25.8 | 27.1 | 25.0 |
| [XGLM (7.5B)](https://huggingface.co/facebook/xglm-7.5B) | 22.7 | 21.7 | 23.6 | 24.5 | 27.5 | 24.5 |
| [Falcon (7B)](https://huggingface.co/tiiuae/falcon-7b) | 22.1 | 22.9 | 25.5 | 25.7 | 27.5 | 25.1 |
| [Falcon (40B)](https://huggingface.co/tiiuae/falcon-40b) | 30.2 | 34.8 | 34.8 | 34.9 | 29.2 | 32.1 |
| [BLOOMZ (560M)](https://huggingface.co/bigscience/bloomz-560m) | 22.9 | 23.6 | 23.2 | 24.2 | 25.1 | 24.0 |
| [BLOOMZ (1.1B)](https://huggingface.co/bigscience/bloomz-1b1) | 20.4 | 21.4 | 21.1 | 23.5 | 24.7 | 22.4 |
| [BLOOMZ (1.7B)](https://huggingface.co/bigscience/bloomz-1b7) | 31.5 | 39.3 | 38.3 | 42.8 | 29.4 | 34.4 |
| [BLOOMZ (3B)](https://huggingface.co/bigscience/bloomz-3b) | 33.5 | 44.5 | 39.7 | 46.7 | 29.8 | 36.4 |
| [BLOOMZ (7.1B)](https://huggingface.co/bigscience/bloomz-7b1) | 37.1 | 46.7 | 44.0 | 49.1 | 28.2 | 38.0 |
| [mT0<sub>small</sub> (300M)](https://huggingface.co/bigscience/mt0-small) | 21.8 | 21.4 | 25.7 | 25.1 | 27.6 | 24.9 |
| [mT0<sub>base</sub> (580M)](https://huggingface.co/bigscience/mt0-base) | 22.6 | 22.6 | 25.7 | 25.6 | 26.9 | 25.0 |
| [mT0<sub>large</sub> (1.2B)](https://huggingface.co/bigscience/mt0-large) | 22.0 | 23.4 | 25.1 | 27.3 | 27.6 | 25.2 |
| [mT0<sub>xl</sub> (3.7B)](https://huggingface.co/bigscience/mt0-xl) | 31.4 | 42.9 | 41.0 | 47.8 | 35.7 | 38.2 |
| [mT0<sub>xxl</sub> (13B)](https://huggingface.co/bigscience/mt0-xxl) | 33.5 | 46.2 | 47.9 | 52.6 | **39.6** | 42.5 |
| [LLaMA (7B)](https://arxiv.org/abs/2302.13971) | 22.8 | 23.1 | 25.1 | 26.7 | 27.6 | 25.3 |
| [LLaMA (13B)](https://arxiv.org/abs/2302.13971) | 24.1 | 23.0 | 24.4 | 29.5 | 26.7 | 25.3 |
| [LLaMA (30B)](https://arxiv.org/abs/2302.13971) | 25.4 | 23.5 | 25.9 | 28.4 | 28.7 | 26.5 |
| [LLaMA (65B)](https://arxiv.org/abs/2302.13971) | 33.0 | 37.7 | 40.8 | 41.4 | 32.1 | 35.8 |
| [Bactrian-X-LLaMA (7B)](https://github.com/mbzuai-nlp/bactrian-x) | 23.3 | 24.0 | 26.0 | 26.1 | 27.5 | 25.7 |
| [Bactrian-X-LLaMA (13B)](https://github.com/mbzuai-nlp/bactrian-x) | 28.3 | 29.9 | 32.8 | 35.2 | 29.2 | 30.3 |
#### GPT-3.5 performance (% accuracy) across different education levels
<p align="left">
<img src="https://github.com/fajri91/eval_picts/blob/master/IndoMMLU-result.png?raw=true" style="width: 370px;" id="title-icon">
</p>
Red indicates that the score is below the minimum passing threshold of 65, while green signifies a score at or above this minimum. We can observe that ChatGPT mostly reaches the passing score of 65 in Indonesian primary school exams.
#### Few-shot Evaluation
<p align="left">
<img src="https://github.com/fajri91/eval_picts/blob/master/plot_fewshot.png?raw=true" style="width: 380px;" id="title-icon">
</p>
## Data
Each question in the dataset is a multiple-choice question with up to five choices, only one of which is correct.
We provide our dataset according to each subject in [data](data) folder. You can also access our dataset via [Hugging Face](https://huggingface.co/datasets/indolem/indommlu).
<!--
#### Quick Use
Our dataset has been added to [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) and [OpenCompass](https://github.com/InternLM/opencompass), you can evaluate your model via these open-source tools.
-->
#### Evaluation
The code for the evaluation of each model we used is in `evaluate.py`, and the code to run them is listed in `run.sh`.
## Citation
```
@inproceedings{koto-etal-2023-indommlu,
title = "Large Language Models Only Pass Primary School Exams in {I}ndonesia: A Comprehensive Test on {I}ndo{MMLU}",
author = "Fajri Koto and Nurul Aisyah and Haonan Li and Timothy Baldwin",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
}
```
## License
The IndoMMLU dataset is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](http://creativecommons.org/licenses/by-nc-sa/4.0/). |
dennlinger/klexikon | ---
annotations_creators:
- found
- expert-generated
language_creators:
- found
- machine-generated
language:
- de
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- summarization
- text2text-generation
task_ids:
- text-simplification
paperswithcode_id: klexikon
pretty_name: Klexikon
tags:
- conditional-text-generation
- simplification
- document-level
---
# Dataset Card for the Klexikon Dataset
## Table of Contents
- [Version History](#version-history)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Version History
- **v0.3** (2022-09-01): Removed five samples from the dataset due to duplication conflicts with other samples.
- **v0.2** (2022-02-28): Updated the files to no longer contain empty sections and removed otherwise empty lines at the end of files. Also removed lines containing coordinates.
- **v0.1** (2022-01-19): Initial data release on Huggingface datasets.
## Dataset Description
- **Homepage:** [N/A]
- **Repository:** [Klexikon repository](https://github.com/dennlinger/klexikon)
- **Paper:** [Klexikon: A German Dataset for Joint Summarization and Simplification](https://arxiv.org/abs/2201.07198)
- **Leaderboard:** [N/A]
- **Point of Contact:** [Dennis Aumiller](mailto:dennis.aumiller@gmail.com)
### Dataset Summary
The Klexikon dataset is a German resource of document-aligned texts between German Wikipedia and the children's lexicon "Klexikon". The dataset was created for the purpose of joint text simplification and summarization, and contains almost 2900 aligned article pairs.
Notably, the children's articles use a simpler language than the original Wikipedia articles; this is in addition to a clear length discrepancy between the source (Wikipedia) and target (Klexikon) domain.
### Supported Tasks and Leaderboards
- `summarization`: The dataset can be used to train a model for summarization. In particular, it poses a harder challenge than some of the commonly used datasets (e.g., CNN/DailyMail), which tend to suffer from positional biases in the source text that make it very easy to generate high-scoring (ROUGE) solutions by simply taking the leading three sentences. Our dataset provides a more challenging extraction task, combined with the additional difficulty of finding lexically appropriate simplifications.
- `simplification`: While not currently supported by the HF task board, text simplification is concerned with the appropriate representation of a text for disadvantaged readers (e.g., children, language learners, or dyslexic readers).
For scoring, we ran preliminary experiments based on [ROUGE](https://huggingface.co/metrics/rouge); however, we want to cautiously point out that ROUGE is incapable of accurately depicting simplification appropriateness.
We combined this with looking at Flesch readability scores, as implemented by [textstat](https://github.com/shivam5992/textstat).
Note that simplification metrics such as [SARI](https://huggingface.co/metrics/sari) are not applicable here, since they require sentence alignments, which we do not provide.
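For intuition, the readability scoring can be sketched in plain Python. This is a rough, self-contained approximation (not textstat's implementation): it uses a crude vowel-run syllable counter, and the constants follow Amstad's German adaptation of the Flesch Reading Ease formula, which we assume is what a German-configured textstat computes.

```python
import re

def count_syllables(word: str) -> int:
    """Crude syllable count: runs of vowels, including German umlauts."""
    return max(1, len(re.findall(r"[aeiouyäöü]+", word.lower())))

def flesch_reading_ease_de(text: str) -> float:
    """Amstad's German Flesch Reading Ease:
    FRE_de = 180 - (words / sentences) - 58.5 * (syllables / words).
    Higher scores indicate easier text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[\wäöüß]+", text.lower())
    syllables = sum(count_syllables(w) for w in words)
    return 180 - len(words) / len(sentences) - 58.5 * (syllables / len(words))

score = flesch_reading_ease_de("ABBA war eine Musikgruppe aus Schweden.")
```

In practice we rely on textstat rather than a hand-rolled counter; the sketch only illustrates what the readability comparison between Wikipedia and Klexikon texts measures.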
### Languages
The associated BCP-47 code is `de-DE`.
The text of the articles is in German. Klexikon articles further undergo a simple form of peer review before publication, and aim to simplify language for 8-13 year old children. This means that the expected text difficulty of Klexikon articles is generally lower than that of Wikipedia entries.
## Dataset Structure
### Data Instances
One datapoint represents the Wikipedia text (`wiki_text`), as well as the Klexikon text (`klexikon_text`).
Sentences are separated by newlines in both texts, and section headings are indicated by leading `==` (`===` for subheadings, `====` for sub-subheadings, etc.).
Further, it includes the `wiki_url` and `klexikon_url`, pointing to the respective source texts. Note that the original articles were extracted in April 2021, so re-crawling the texts yourself will likely change some content.
Lastly, we include a unique identifier `u_id` as well as the page title `title` of the Klexikon page.
Sample (abridged texts for clarity):
```
{
"u_id": 0,
"title": "ABBA",
"wiki_url": "https://de.wikipedia.org/wiki/ABBA",
"klexikon_url": "https://klexikon.zum.de/wiki/ABBA",
"wiki_sentences": [
"ABBA ist eine schwedische Popgruppe, die aus den damaligen Paaren Agnetha Fältskog und Björn Ulvaeus sowie Benny Andersson und Anni-Frid Lyngstad besteht und sich 1972 in Stockholm formierte.",
"Sie gehört mit rund 400 Millionen verkauften Tonträgern zu den erfolgreichsten Bands der Musikgeschichte.",
"Bis in die 1970er Jahre hatte es keine andere Band aus Schweden oder Skandinavien gegeben, der vergleichbare Erfolge gelungen waren.",
"Trotz amerikanischer und britischer Dominanz im Musikgeschäft gelang der Band ein internationaler Durchbruch.",
"Sie hat die Geschichte der Popmusik mitgeprägt.",
"Zu ihren bekanntesten Songs zählen Mamma Mia, Dancing Queen und The Winner Takes It All.",
"1982 beendeten die Gruppenmitglieder aufgrund privater Differenzen ihre musikalische Zusammenarbeit.",
"Seit 2016 arbeiten die vier Musiker wieder zusammen an neuer Musik, die 2021 erscheinen soll.",
],
"klexikon_sentences": [
"ABBA war eine Musikgruppe aus Schweden.",
"Ihre Musikrichtung war die Popmusik.",
"Der Name entstand aus den Anfangsbuchstaben der Vornamen der Mitglieder, Agnetha, Björn, Benny und Anni-Frid.",
"Benny Andersson und Björn Ulvaeus, die beiden Männer, schrieben die Lieder und spielten Klavier und Gitarre.",
"Anni-Frid Lyngstad und Agnetha Fältskog sangen."
]
},
```
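Given the sentence-per-line format with `==`-style headings described above, the section structure of an article can be recovered with a few lines of Python. A minimal sketch (the `Introduction` label and the sample heading are our own illustration, not part of the dataset):

```python
def split_into_sections(sentences):
    """Group a pre-split article into (heading, sentences) pairs.
    Lines starting with '==' are treated as (sub)section headings;
    everything before the first heading is collected as 'Introduction'."""
    sections = [("Introduction", [])]
    for sent in sentences:
        if sent.startswith("=="):
            sections.append((sent.strip("= ").strip(), []))
        else:
            sections[-1][1].append(sent)
    return sections

article = [
    "ABBA war eine Musikgruppe aus Schweden.",
    "== Geschichte ==",
    "Der Name entstand aus den Anfangsbuchstaben der Vornamen.",
]
sections = split_into_sections(article)
```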
### Data Fields
* `u_id` (`int`): A unique identifier for each document pair in the dataset. IDs 0-2349 are reserved for training data, 2350-2623 for testing, and 2624-2897 for validation.
* `title` (`str`): Title of the Klexikon page for this sample.
* `wiki_url` (`str`): URL of the associated Wikipedia article. Notably, this is non-trivial, since we potentially have disambiguated pages, where the Wikipedia title is not exactly the same as the Klexikon one.
* `klexikon_url` (`str`): URL of the Klexikon article.
* `wiki_text` (`List[str]`): List of sentences of the Wikipedia article. We prepare a pre-split document with spacy's sentence splitting (model: `de_core_news_md`). Additionally, please note that we do not include page contents outside of `<p>` tags, which excludes lists, captions and images.
* `klexikon_text` (`List[str]`): List of sentences of the Klexikon article. We apply the same processing as for the Wikipedia texts.
### Data Splits
We provide a stratified split of the dataset, based on the length of the respective Wiki article/Klexikon article pair (according to number of sentences).
The x-axis represents the length of the Wikipedia article, and the y-axis the length of the Klexikon article.
We segment the coordinate systems into rectangles of shape `(100, 10)`, and randomly sample a split of 80/10/10 for training/validation/test from each rectangle to ensure stratification. In case of rectangles with less than 10 entries, we put all samples into training.
The final splits have the following size:
* 2350 samples for training
* 274 samples for validation
* 274 samples for testing
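The bucketing procedure above can be sketched as follows. This is our reconstruction of the described logic, not the original split script; `wiki_len` and `klexikon_len` stand for the respective sentence counts, and the seed is arbitrary.

```python
import random
from collections import defaultdict

def stratified_split(pairs, seed=42):
    """Bucket article pairs into (100, 10)-sized rectangles by
    (wiki sentence count, klexikon sentence count), then sample
    roughly 80/10/10 per bucket; buckets with fewer than 10 pairs
    go entirely to training."""
    buckets = defaultdict(list)
    for pair in pairs:
        key = (pair["wiki_len"] // 100, pair["klexikon_len"] // 10)
        buckets[key].append(pair)

    rng = random.Random(seed)
    train, val, test = [], [], []
    for bucket in buckets.values():
        if len(bucket) < 10:
            train.extend(bucket)  # too small to stratify
            continue
        rng.shuffle(bucket)
        n_val = n_test = len(bucket) // 10
        val.extend(bucket[:n_val])
        test.extend(bucket[n_val:n_val + n_test])
        train.extend(bucket[n_val + n_test:])
    return train, val, test
```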
## Dataset Creation
### Curation Rationale
As previously described, the Klexikon resource was created as an attempt to bridge the two fields of text summarization and text simplification. Previous datasets suffer from one or more of the following shortcomings:
* They primarily focus on input/output pairs of similar lengths, which does not reflect longer-form texts.
* Data exists primarily for English, and other languages are notoriously understudied.
* Alignments exist at the sentence level, but not at the document level.
This dataset serves as a starting point to investigate the feasibility of end-to-end simplification systems for longer input documents.
### Source Data
#### Initial Data Collection and Normalization
Data was collected from [Klexikon](https://klexikon.zum.de), and afterwards aligned with corresponding texts from [German Wikipedia](https://de.wikipedia.org).
Specifically, the collection was performed in April 2021, at which point 3145 articles could be extracted from Klexikon. Afterwards, we semi-automatically align the articles with Wikipedia by looking up articles with the same title.
For articles that do not match exactly, we manually review their content and match them to an appropriate substitute if at least 66% of the Klexikon paragraphs can be matched.
Similarly, we proceed to manually review disambiguation pages on Wikipedia.
We extract only full-text content, excluding figures, captions, and list elements from the final text corpus, and only retain articles for which the respective Wikipedia document consists of at least 15 paragraphs after pre-processing.
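The automatic part of this pipeline (exact-title matching plus the 15-paragraph filter) can be sketched as below. This is a simplified reconstruction under stated assumptions: the field names are hypothetical, and the manual review of non-matching and ambiguous titles is only indicated by a comment.

```python
def align_and_filter(klexikon_articles, wiki_articles, min_wiki_paragraphs=15):
    """Align Klexikon articles to Wikipedia by exact title match, and keep
    only pairs whose Wikipedia article has enough paragraphs after
    pre-processing."""
    wiki_by_title = {a["title"]: a for a in wiki_articles}
    aligned = []
    for klex in klexikon_articles:
        wiki = wiki_by_title.get(klex["title"])
        if wiki is None:
            continue  # candidate for manual review / disambiguation handling
        if len(wiki["paragraphs"]) < min_wiki_paragraphs:
            continue  # Wikipedia article too short to summarize from
        aligned.append((klex, wiki))
    return aligned

# Toy illustration with made-up paragraph counts:
klexikon = [{"title": "ABBA"}, {"title": "Aarhus"}]
wikipedia = [
    {"title": "ABBA", "paragraphs": ["..."] * 20},   # long enough, kept
    {"title": "Aarhus", "paragraphs": ["..."] * 5},  # too short, dropped
]
aligned = align_and_filter(klexikon, wikipedia)
```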
#### Who are the source language producers?
The language producers are contributors to Klexikon and Wikipedia. No demographic information was available from the data sources.
### Annotations
#### Annotation process
Annotations were performed by manually reviewing the URLs of the ambiguous article pairs. No annotation platforms or existing tools were used in the process.
Otherwise, articles were matched based on the exact title.
#### Who are the annotators?
The manually aligned articles were reviewed by the dataset author (Dennis Aumiller).
### Personal and Sensitive Information
Since Klexikon and Wikipedia are public encyclopedias, no further personal or sensitive information is included. We did not investigate to what extent information about public figures is included in the dataset.
## Considerations for Using the Data
### Social Impact of Dataset
Accessibility on the web is still a big issue, particularly for disadvantaged readers.
This dataset has the potential to strengthen text simplification systems, which can improve the situation.
In terms of language coverage, this dataset also has a beneficial impact on the availability of German data.
A potential downside stems from the automatic alignment: the alignments will never be 100% perfect and can therefore produce mis-aligned articles (or associations), despite our best intentions.
### Discussion of Biases
We have not tested whether any particular bias towards a specific article *type* (i.e., "person", "city", etc.) exists.
Similarly, we attempted to present an unbiased (stratified) split for the validation and test sets, but given that we only cover around 2900 articles, it is possible that these articles represent a particular focal lens on the overall distribution of lexical content.
### Other Known Limitations
Since the articles were written independently of each other, it is not guaranteed that every sentence of the simplified article is covered by the source. This can also stem from the fact that Wikipedia sometimes has separate pages for particular aspects (e.g., the city of "Aarhus" has a separate page for its art museum ARoS, whereas Klexikon describes ARoS on the page of the city itself).
## Additional Information
### Dataset Curators
The dataset was curated only by the author of this dataset, Dennis Aumiller.
### Licensing Information
Klexikon and Wikipedia make their textual contents available under the CC BY-SA license, which will be inherited for this dataset.
### Citation Information
If you use our dataset or associated code, please cite our paper:
```
@inproceedings{aumiller-gertz-2022-klexikon,
title = "Klexikon: A {G}erman Dataset for Joint Summarization and Simplification",
author = "Aumiller, Dennis and
Gertz, Michael",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.288",
pages = "2693--2701"
}
```