| datasetId | card |
|---|---|
rajeshvarma/QA_on_SLA | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- conditional-text-generation
task_ids:
- summarization
---
|
sethapun/arithmetic_2as_1to750 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: int64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 61432
num_examples: 2000
- name: validation
num_bytes: 12264
num_examples: 400
download_size: 27596
dataset_size: 73696
---
# Dataset Card for "arithmetic_2as_1to750"
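The card below only lists the schema, so here is a minimal, untested loading sketch based on the `dataset_info` metadata above (the field names and the `'false'`/`'true'` label names are taken from the YAML header; nothing else about the data is assumed):
```python
from datasets import load_dataset

# Untested sketch: load the 2000-example train split described in the YAML header.
ds = load_dataset("sethapun/arithmetic_2as_1to750", split="train")

row = ds[0]
# Per the metadata: expression (string), answer (int64),
# label (class_label with names 'false'/'true').
label_name = ds.features["label"].int2str(row["label"])
print(row["expression"], row["answer"], label_name)
```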
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mehmetkay-sudo/rpbt | ---
license: gpl-2.0
---
|
MatsuoDochiai/Tifany | ---
license: openrail
---
|
fathyshalab/MDCSI_transport-logistik | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 175764
num_examples: 352
- name: test
num_bytes: 45011
num_examples: 88
download_size: 120959
dataset_size: 220775
---
# Dataset Card for "reklamation24_transport-logistik-v2"
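Since the card is otherwise empty, a short hedged sketch of how one might inspect the label balance implied by the schema above (assuming the repo is public and loads as-is):
```python
from collections import Counter

from datasets import load_dataset

# Hedged sketch: count examples per label_name in the 352-example train split.
ds = load_dataset("fathyshalab/MDCSI_transport-logistik", split="train")
print(Counter(ds["label_name"]))
```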
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_null_genitive | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 283846
num_examples: 1167
- name: dev_mismatched
num_bytes: 330727
num_examples: 1276
- name: test_matched
num_bytes: 297542
num_examples: 1245
- name: test_mismatched
num_bytes: 343646
num_examples: 1336
- name: train
num_bytes: 12247403
num_examples: 50122
download_size: 8807747
dataset_size: 13503164
---
# Dataset Card for "MULTI_VALUE_mnli_null_genitive"
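As a hedged sketch only (the card gives no usage notes), the five splits listed in the metadata above can be loaded together and their sizes checked against the YAML header:
```python
from datasets import load_dataset

# Hedged sketch: load every split listed in the metadata and report sizes.
dd = load_dataset("liuyanchen1015/MULTI_VALUE_mnli_null_genitive")
for name, split in dd.items():
    print(name, len(split))  # expected e.g. train 50122, dev_matched 1167, ...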
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Akshata/autotrain-data-person-name-validity1 | ---
language:
- en
---
# AutoTrain Dataset for project: person-name-validity1
## Dataset Description
This dataset has been automatically processed by AutoTrain for project person-name-validity1.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"tokens": [
"divided"
],
"tags": [
0
]
},
{
"tokens": [
"nusrat"
],
"tags": [
1
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"tags": "Sequence(feature=ClassLabel(num_classes=2, names=['0', '2'], id=None), length=-1, id=None)"
}
```
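Given the field types above, a small hedged sketch of how the `ClassLabel` tags could be decoded back to their string names (assuming the repo loads with exactly these features):
```python
from datasets import load_dataset

# Hedged sketch: decode the integer tags using the ClassLabel feature.
ds = load_dataset("Akshata/autotrain-data-person-name-validity1", split="train")
tag_feature = ds.features["tags"].feature  # ClassLabel over the tag names
sample = ds[0]
print(sample["tokens"], [tag_feature.int2str(t) for t in sample["tags"]])
```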
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2499 |
| valid | 499 |
|
open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v4 | ---
pretty_name: Evaluation run of Locutusque/Orca-2-13b-SFT-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/Orca-2-13b-SFT-v4](https://huggingface.co/Locutusque/Orca-2-13b-SFT-v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T01:19:01.449787](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v4/blob/main/results_2024-02-18T01-19-01.449787.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.599714682630193,\n\
\ \"acc_stderr\": 0.03295973692679564,\n \"acc_norm\": 0.6054002601476083,\n\
\ \"acc_norm_stderr\": 0.03365955635648111,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729405,\n \"mc2\": 0.5115483861843445,\n\
\ \"mc2_stderr\": 0.014747232173396328\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212864,\n\
\ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449708\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6073491336387173,\n\
\ \"acc_stderr\": 0.004873421833291588,\n \"acc_norm\": 0.7957578171678948,\n\
\ \"acc_norm_stderr\": 0.004023228461406272\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7161290322580646,\n \"acc_stderr\": 0.02564938106302926,\n \"\
acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.02564938106302926\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306426,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.01684767640009109,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.01684767640009109\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290923,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290923\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835795,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835795\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560417,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560417\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139946,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139946\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968822,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968822\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n\
\ \"acc_stderr\": 0.015476515438005567,\n \"acc_norm\": 0.3106145251396648,\n\
\ \"acc_norm_stderr\": 0.015476515438005567\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.02685729466328141,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.02685729466328141\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937624,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886345,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886345\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4322033898305085,\n\
\ \"acc_stderr\": 0.012652297777114968,\n \"acc_norm\": 0.4322033898305085,\n\
\ \"acc_norm_stderr\": 0.012652297777114968\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877753,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877753\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.0196758081352815,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.0196758081352815\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729405,\n \"mc2\": 0.5115483861843445,\n\
\ \"mc2_stderr\": 0.014747232173396328\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625857\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28278999241849884,\n \
\ \"acc_stderr\": 0.012405020417873619\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/Orca-2-13b-SFT-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|arc:challenge|25_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|gsm8k|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hellaswag|10_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T01-19-01.449787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T01-19-01.449787.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- '**/details_harness|winogrande|5_2024-02-18T01-19-01.449787.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T01-19-01.449787.parquet'
- config_name: results
data_files:
- split: 2024_02_18T01_19_01.449787
path:
- results_2024-02-18T01-19-01.449787.parquet
- split: latest
path:
- results_2024-02-18T01-19-01.449787.parquet
---
# Dataset Card for Evaluation run of Locutusque/Orca-2-13b-SFT-v4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Orca-2-13b-SFT-v4](https://huggingface.co/Locutusque/Orca-2-13b-SFT-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v4",
"harness_winogrande_5",
split="train")
```
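To see all 63 available configurations before picking one, you can, for instance, enumerate them (a hedged sketch using `get_dataset_config_names` from the `datasets` library):
```python
from datasets import get_dataset_config_names

# List every per-task configuration in this evaluation-details repo.
configs = get_dataset_config_names("open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v4")
print(len(configs), configs[:5])
```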
## Latest results
These are the [latest results from run 2024-02-18T01:19:01.449787](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v4/blob/main/results_2024-02-18T01-19-01.449787.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.599714682630193,
"acc_stderr": 0.03295973692679564,
"acc_norm": 0.6054002601476083,
"acc_norm_stderr": 0.03365955635648111,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729405,
"mc2": 0.5115483861843445,
"mc2_stderr": 0.014747232173396328
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.014361097288449708
},
"harness|hellaswag|10": {
"acc": 0.6073491336387173,
"acc_stderr": 0.004873421833291588,
"acc_norm": 0.7957578171678948,
"acc_norm_stderr": 0.004023228461406272
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.69,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302926,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302926
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306426,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.01684767640009109,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.01684767640009109
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290923,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290923
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835795,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835795
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560417,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560417
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139946,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139946
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968822,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968822
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005567,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005567
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.02685729466328141,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.02685729466328141
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937624,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886345,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886345
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4322033898305085,
"acc_stderr": 0.012652297777114968,
"acc_norm": 0.4322033898305085,
"acc_norm_stderr": 0.012652297777114968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877753,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877753
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.0196758081352815,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.0196758081352815
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729405,
"mc2": 0.5115483861843445,
"mc2_stderr": 0.014747232173396328
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625857
},
"harness|gsm8k|5": {
"acc": 0.28278999241849884,
"acc_stderr": 0.012405020417873619
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mamiksik/raw-commit-diffs | ---
dataset_info:
features:
- name: language
dtype: string
- name: owner
dtype: string
- name: repo
dtype: string
- name: sha
dtype: string
- name: message
dtype: string
- name: path
dtype: string
- name: patch
dtype: string
- name: is_multipart
dtype: bool
splits:
- name: train
num_bytes: 791921294
num_examples: 399253
download_size: 240089156
dataset_size: 791921294
---
# Dataset Card for "raw-commit-diffs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ai2lumos/lumos_complex_qa_plan_iterative | ---
license: apache-2.0
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- language-agent
- reasoning
- question-answering
- planning
size_categories:
- 10K<n<100K
---
# 🪄 Agent Lumos: Unified and Modular Training for Open-Source Language Agents
<p align="center">
🌐<a href="https://allenai.github.io/lumos">[Website]</a>
📝<a href="https://arxiv.org/abs/2311.05657">[Paper]</a>
🤗<a href="https://huggingface.co/datasets?sort=trending&search=ai2lumos">[Data]</a>
🤗<a href="https://huggingface.co/models?sort=trending&search=ai2lumos">[Model]</a>
🤗<a href="https://huggingface.co/spaces/ai2lumos/lumos_data_demo">[Demo]</a>
</p>
We introduce 🪄**Lumos**, Language Agents with **Unified** Formats, **Modular** Design, and **Open-Source** LLMs. **Lumos** unifies a suite of complex interactive tasks and achieves competitive performance with GPT-4/3.5-based and larger open-source agents.
**Lumos** has the following features:
* 🧩 **Modular Architecture**:
  - 🧩 **Lumos** consists of planning, grounding, and execution modules built on LLAMA-2-7B/13B and off-the-shelf APIs.
- 🤗 **Lumos** utilizes a unified data format that encompasses multiple task types, thereby enabling the developed agent framework to conveniently support a range of interactive tasks.
* 🌍 **Diverse Training Data**:
  - 🌍 **Lumos** is trained with ~56K diverse, high-quality subgoal/action annotations derived with GPT-4 from ground-truth reasoning steps in existing benchmarks.
- ⚒️ **Lumos** data can be instrumental for future research in developing open-source agents for complex interactive tasks.
* 🚀 **Competitive Performance**:
  - 🚀 **Lumos** is comparable to, or even beats, **GPT-series** agents on the web task Mind2Web and the complex QA task HotpotQA, and **larger open agents** on math and multimodal tasks.
- 🚀 **Lumos** exceeds contemporaneous agents that have been **fine-tuned** with in-domain HotpotQA, Mind2Web and ScienceQA annotations, such as **FiReAct**, **AgentLM**, and **AutoAct**.
- 🚀 **Lumos** performs better than open agent baseline formulations including **chain-of-thoughts** and **integrated** training.
  - 🚀 **Lumos** surpasses larger open LLM agents and domain-specific agents on the unseen tasks WebShop and InterCode_SQL.
## Data Overview
`lumos_complex_qa_plan_iterative` is the data for training **planning** module on **complex QA** task in **Lumos-Iterative (Lumos-I)** formulation.
The sources of the training annotations are shown below:
| Datasets | Number |
|---|---|
|StrategyQA|1777|
|Musique|17632|
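The annotations can be inspected with the standard `datasets` API. The snippet below is a minimal sketch; the `train` split name is an assumption, since this card does not spell out the split layout.
```python
# Minimal sketch: inspect the planning-module annotations with the
# Hugging Face `datasets` library. The "train" split name is an
# assumption; adjust it to whatever splits the repo actually exposes.
from datasets import load_dataset

plan_data = load_dataset("ai2lumos/lumos_complex_qa_plan_iterative", split="train")
print(plan_data)     # schema and number of examples
print(plan_data[0])  # one subgoal-annotation example
```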
## Models Trained with the Data
`lumos_complex_qa_plan_iterative` is used to train the following models.
|Model|Huggingface Repo|
|---|---|
|`lumos_complex_qa_plan_iterative`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_complex_qa_plan_iterative) |
|`lumos_complex_qa_plan_iterative-13B`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_complex_qa_plan_iterative-13B) |
|`lumos_unified_plan_iterative`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_unified_plan_iterative) |
|`lumos_unified_plan_iterative-13B`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_unified_plan_iterative-13B) |
## Citation
If you find this work relevant to your research, please feel free to cite it!
```
@article{yin2023lumos,
title={Agent Lumos: Unified and Modular Training for Open-Source Language Agents},
author={Yin, Da and Brahman, Faeze and Ravichander, Abhilasha and Chandu, Khyathi and Chang, Kai-Wei and Choi, Yejin and Lin, Bill Yuchen},
journal={arXiv preprint arXiv:2311.05657},
year={2023}
}
``` |
heliosprime/twitter_dataset_1713214291 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26182
num_examples: 72
download_size: 21964
dataset_size: 26182
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713214291"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
katarinayuan/ProtST-BinaryLocalization | ---
configs:
- config_name: default
data_files:
- split: train
path: "binary_localization_train.csv"
- split: validation
path: "binary_localization_valid.csv"
- split: test
path: "binary_localization_test.csv"
--- |
lumenggan/avatar-the-last-airbender-tagged | ---
dataset_info:
features:
- name: image
dtype: image
- name: tags
sequence: string
splits:
- name: train
num_bytes: 1467443424.776
num_examples: 13896
download_size: 1427401832
dataset_size: 1467443424.776
task_categories:
- image-to-text
- image-classification
language:
- en
tags:
- art
- anime
- atla
pretty_name: 'Avatar: The Last Airbender - Tagged Screencaps'
size_categories:
- 10K<n<100K
license: cc
---
# Dataset Card for "avatar-the-last-airbender-tagged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malteos/germeval2017 | ---
language:
- de
---
# Germeval Task 2017: Shared Task on Aspect-based Sentiment in Social Media Customer Feedback
In the connected, modern world, customer feedback is a valuable source of insights on the quality of products or services. This feedback allows customers to benefit from the experiences of others and enables businesses to react to requests, complaints, or recommendations. However, the more people use a product or service, the more feedback is generated, which results in the major challenge of analyzing huge amounts of feedback in an efficient, but still meaningful way.
Thus, we propose a shared task on automatically analyzing customer reviews about “Deutsche Bahn”, the German public train operator with about two billion passengers each year.
Example:
> “RT @XXX: Da hört jemand in der Bahn so laut ‘700 Main Street’ durch seine Kopfhörer, dass ich mithören kann. :( :( :(“
> (English: “RT @XXX: Someone on the train is listening to ‘700 Main Street’ through their headphones so loudly that I can hear it too. :( :( :(”)
As shown in the example, insights from reviews can be derived at different granularities. The review contains a general evaluation of the trip (the customer disliked the travel). Furthermore, the review evaluates a dedicated aspect of the train travel (“laut”, German for “loud” → the customer did not like the noise level).
Consequently, we frame the task as aspect-based sentiment analysis with four subtasks:
- Subtask A: relevance classification
- Subtask B: document-level polarity
- Subtask C: aspect-level polarity
- Subtask D: opinion target extraction
## Data format
```
ID <tab> Text <tab> Relevance <tab> Sentiment <tab> Aspect:Polarity (whitespace separated)
```
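A minimal parsing sketch for this format is shown below; the file name `train.tsv` is hypothetical, and every line is assumed to carry all five tab-separated columns.
```python
# Minimal sketch: split one line of the GermEval 2017 format into its
# fields. The file name "train.tsv" is hypothetical; lines are assumed
# to always carry all five tab-separated columns.
with open("train.tsv", encoding="utf-8") as f:
    for line in f:
        doc_id, text, relevance, sentiment, aspects = line.rstrip("\n").split("\t")
        # Aspect:Polarity pairs are whitespace-separated; rsplit keeps
        # any colons inside the aspect label itself intact.
        pairs = [pair.rsplit(":", 1) for pair in aspects.split()]
        print(doc_id, relevance, sentiment, pairs)
```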
## Links
- http://ltdata1.informatik.uni-hamburg.de/germeval2017/
- https://sites.google.com/view/germeval2017-absa/
## How to cite
```bibtex
@inproceedings{germevaltask2017,
title = {{GermEval 2017: Shared Task on Aspect-based Sentiment in Social Media Customer Feedback}},
author = {Michael Wojatzki and Eugen Ruppert and Sarah Holschneider and Torsten Zesch and Chris Biemann},
year = {2017},
booktitle = {Proceedings of the GermEval 2017 – Shared Task on Aspect-based Sentiment in Social Media Customer Feedback},
address={Berlin, Germany},
pages={1--12}
}
``` |
newsmediabias/ReDev | ---
license: mit
pretty_name: bias, fairness, responsible AI, LLM, LLM Hallucinations
---
# Dataset Card for Responsible Development of Language Models (ReDev) Framework Test Suite
## Dataset Description
As the use of large language models (LLMs) increases for everyday tasks, appropriate safeguards must be in place to ensure unbiased and safe output.
There is a need for responsible LLMs that are trained fairly, transparent to the public, and regularly monitored after deployment.
We introduce the _Responsible Development of Language Models (ReDev)_ framework to foster the development of _fair_, _safe_, and _robust_ LLMs for all users.
Here, we present a test suite of unique prompt types to assess LLMs on the aforementioned elements and facilitate research, ensuring all generated responses are non-harmful and free from biased content.
Outputs from four state-of-the-art LLMs, OPT, GPT-3.5, GPT-4, and LLaMA-2, are evaluated by our test suite, highlighting the importance of considering fairness, safety, and robustness at every stage of the machine learning pipeline, including data curation, training, and post-deployment.
**Created by Veronica** |
VivendoDigital/pescarai | ---
license: apache-2.0
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: query_count
dtype: int64
splits:
- name: train
num_bytes: 17680783
num_examples: 1000
download_size: 10076798
dataset_size: 17680783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tr416/dataset_20231006_200650 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73595
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_200650"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
universalner/uner_llm_inst_slovak | ---
license: cc-by-sa-4.0
language:
- sk
task_categories:
- token-classification
dataset_info:
#- config_name: sk_snk
# splits:
# - name: test
# num_examples: 1060
# - name: dev
# num_examples: 1059
# - name: train
# num_examples: 8482
---
# Dataset Card for Universal NER v1 in the Aya format - Slovak subset
This dataset is a conversion of the Slovak data in the original Universal NER v1 into the Aya instruction format, and it is released here under the same CC-BY-SA 4.0 license and conditions.
The dataset contains different subsets and their dev/test/train splits, depending on the language. For more details, please refer to the sections below.
## Dataset Details
For the original Universal NER dataset v1 and more details, please check https://huggingface.co/datasets/universalner/universal_ner.
For details on the conversion to the Aya instructions format, please see the complete version: https://huggingface.co/datasets/universalner/uner_llm_instructions
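As with any dataset on the Hub, the subset can be loaded with the `datasets` library; the following is a minimal sketch assuming the default configuration.
```python
# Minimal sketch: load the Slovak instruction-format subset with the
# Hugging Face `datasets` library, assuming the default configuration.
from datasets import load_dataset

uner_sk = load_dataset("universalner/uner_llm_inst_slovak")
print(uner_sk)  # prints the available dev/test/train splits and sizes
```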
## Citation
If you utilize this dataset version, feel free to cite/footnote the complete version at https://huggingface.co/datasets/universalner/uner_llm_instructions, but please also cite the *original dataset publication*.
**BibTeX:**
```
@misc{mayhew2023universal,
title={{Universal NER: A Gold-Standard Multilingual Named Entity Recognition Benchmark}},
author={Stephen Mayhew and Terra Blevins and Shuheng Liu and Marek Šuppa and Hila Gonen and Joseph Marvin Imperial and Börje F. Karlsson and Peiqin Lin and Nikola Ljubešić and LJ Miranda and Barbara Plank and Arij Riabi and Yuval Pinter},
year={2023},
eprint={2311.09122},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
open-llm-leaderboard/details_AA051611__A0118 | ---
pretty_name: Evaluation run of AA051611/A0118
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051611/A0118](https://huggingface.co/AA051611/A0118) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__A0118\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T23:48:21.810095](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0118/blob/main/results_2024-01-18T23-48-21.810095.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6750935567286499,\n\
\ \"acc_stderr\": 0.03150224444254494,\n \"acc_norm\": 0.6839013238259298,\n\
\ \"acc_norm_stderr\": 0.03214560635872275,\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.01707823074343144,\n \"mc2\": 0.5579325936654852,\n\
\ \"mc2_stderr\": 0.015526306494139296\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642476,\n\
\ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.0143610972884497\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6517625970922127,\n\
\ \"acc_stderr\": 0.004754380554929216,\n \"acc_norm\": 0.8378809002190799,\n\
\ \"acc_norm_stderr\": 0.0036780679944244557\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\"\
: 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\"\
: 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \
\ \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6827586206896552,\n \"acc_stderr\": 0.03878352372138622,\n\
\ \"acc_norm\": 0.6827586206896552,\n \"acc_norm_stderr\": 0.03878352372138622\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5952380952380952,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.5952380952380952,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.02203721734026782,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.02203721734026782\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.03465304488406795,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853137,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853137\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7256410256410256,\n \"acc_stderr\": 0.022622765767493214,\n\
\ \"acc_norm\": 0.7256410256410256,\n \"acc_norm_stderr\": 0.022622765767493214\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630882,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630882\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8825688073394495,\n \"acc_stderr\": 0.013802780227377342,\n \"\
acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.013802780227377342\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.0313217980308329,\n \"acc_norm\"\
: 0.7254901960784313,\n \"acc_norm_stderr\": 0.0313217980308329\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460295,\n \"\
acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179337,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179337\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8569604086845466,\n\
\ \"acc_stderr\": 0.012520023176796515,\n \"acc_norm\": 0.8569604086845466,\n\
\ \"acc_norm_stderr\": 0.012520023176796515\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.0239291555173513,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.0239291555173513\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5319148936170213,\n \"acc_stderr\": 0.02976667507587387,\n \
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.02976667507587387\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5071707953063885,\n\
\ \"acc_stderr\": 0.012768922739553311,\n \"acc_norm\": 0.5071707953063885,\n\
\ \"acc_norm_stderr\": 0.012768922739553311\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.026040662474201264,\n\
\ \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.026040662474201264\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7238562091503268,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.7238562091503268,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174917,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174917\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.01707823074343144,\n \"mc2\": 0.5579325936654852,\n\
\ \"mc2_stderr\": 0.015526306494139296\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774083\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.26383623957543595,\n \
\ \"acc_stderr\": 0.012139386425126807\n }\n}\n```"
repo_url: https://huggingface.co/AA051611/A0118
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|arc:challenge|25_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|arc:challenge|25_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|gsm8k|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|gsm8k|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hellaswag|10_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hellaswag|10_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T16-42-56.875464.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T23-48-21.810095.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T23-48-21.810095.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- '**/details_harness|winogrande|5_2024-01-18T16-42-56.875464.parquet'
- split: 2024_01_18T23_48_21.810095
path:
- '**/details_harness|winogrande|5_2024-01-18T23-48-21.810095.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T23-48-21.810095.parquet'
- config_name: results
data_files:
- split: 2024_01_18T16_42_56.875464
path:
- results_2024-01-18T16-42-56.875464.parquet
- split: 2024_01_18T23_48_21.810095
path:
- results_2024-01-18T23-48-21.810095.parquet
- split: latest
path:
- results_2024-01-18T23-48-21.810095.parquet
---
# Dataset Card for Evaluation run of AA051611/A0118
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/A0118](https://huggingface.co/AA051611/A0118) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__A0118",
"harness_winogrande_5",
split="train")
```
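To inspect a specific run rather than the latest one, you can load the aggregated "results" configuration with one of the timestamped split names listed above. A minimal sketch (not part of the auto-generated card), assuming the `datasets` library is installed:
```python
from datasets import load_dataset

# Aggregated metrics for one specific run, selected by its timestamped
# split name (see the "results" configuration in the YAML header above).
results = load_dataset(
    "open-llm-leaderboard/details_AA051611__A0118",
    "results",
    split="2024_01_18T23_48_21.810095",
)
```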
## Latest results
These are the [latest results from run 2024-01-18T23:48:21.810095](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0118/blob/main/results_2024-01-18T23-48-21.810095.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6750935567286499,
"acc_stderr": 0.03150224444254494,
"acc_norm": 0.6839013238259298,
"acc_norm_stderr": 0.03214560635872275,
"mc1": 0.390452876376989,
"mc1_stderr": 0.01707823074343144,
"mc2": 0.5579325936654852,
"mc2_stderr": 0.015526306494139296
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642476,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.0143610972884497
},
"harness|hellaswag|10": {
"acc": 0.6517625970922127,
"acc_stderr": 0.004754380554929216,
"acc_norm": 0.8378809002190799,
"acc_norm_stderr": 0.0036780679944244557
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6827586206896552,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.6827586206896552,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026782,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026782
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853137,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853137
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7256410256410256,
"acc_stderr": 0.022622765767493214,
"acc_norm": 0.7256410256410256,
"acc_norm_stderr": 0.022622765767493214
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.029670906124630882,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.029670906124630882
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8825688073394495,
"acc_stderr": 0.013802780227377342,
"acc_norm": 0.8825688073394495,
"acc_norm_stderr": 0.013802780227377342
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.0313217980308329,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.0313217980308329
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179337,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179337
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8569604086845466,
"acc_stderr": 0.012520023176796515,
"acc_norm": 0.8569604086845466,
"acc_norm_stderr": 0.012520023176796515
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.0239291555173513,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.0239291555173513
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.02976667507587387,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.02976667507587387
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5071707953063885,
"acc_stderr": 0.012768922739553311,
"acc_norm": 0.5071707953063885,
"acc_norm_stderr": 0.012768922739553311
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.026040662474201264,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.026040662474201264
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7238562091503268,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.7238562091503268,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174917,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174917
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.01707823074343144,
"mc2": 0.5579325936654852,
"mc2_stderr": 0.015526306494139296
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774083
},
"harness|gsm8k|5": {
"acc": 0.26383623957543595,
"acc_stderr": 0.012139386425126807
}
}
```
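As a quick sanity check, the per-task MMLU scores above can be aggregated directly from the JSON. A minimal sketch, assuming the dictionary printed above has been saved locally as `results.json` (a hypothetical filename):
```python
import json

# Load the metrics dictionary shown above.
with open("results.json") as f:
    metrics = json.load(f)

# Macro-average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu_accs = [v["acc"] for k, v in metrics.items()
             if k.startswith("harness|hendrycksTest")]
print(f"MMLU macro-average accuracy: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```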
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anti-ai/ViNLI-SimCSE-supervised_v2 | ---
license: gpl-2.0
task_categories:
- sentence-similarity
- text-retrieval
language:
- vi
size_categories:
- 100K<n<1M
--- |
ASR-HypR/LibriSpeech_withLM | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev_clean
path: data/dev_clean-*
- split: dev_other
path: data/dev_other-*
- split: test_clean
path: data/test_clean-*
- split: test_other
path: data/test_other-*
dataset_info:
features:
- name: utt_id
dtype: string
- name: hyps
sequence: string
- name: att_score
sequence: float64
- name: ctc_score
sequence: float64
- name: score
sequence: float64
- name: ref
dtype: string
- name: lm_score
sequence: float64
splits:
- name: train
num_bytes: 3073751225
num_examples: 281231
- name: dev_clean
num_bytes: 19839669
num_examples: 2703
- name: dev_other
num_bytes: 18981732
num_examples: 2864
- name: test_clean
num_bytes: 19336959
num_examples: 2620
- name: test_other
num_bytes: 19464386
num_examples: 2939
download_size: 879395852
dataset_size: 3151373971
---
# Dataset Card for "LibriSpeech_withLM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shakermaker-1/test-512-lines | ---
license: openrail
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-7b-dolphin_10w-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-7b-dolphin_10w-test](https://huggingface.co/CHIH-HUNG/llama-2-7b-dolphin_10w-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T02:39:30.336527](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test/blob/main/results_2023-08-30T02%3A39%3A30.336527.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.441355572237513,\n\
\ \"acc_stderr\": 0.035230208768426426,\n \"acc_norm\": 0.44554035341923276,\n\
\ \"acc_norm_stderr\": 0.03522008237001183,\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.01552856663708728,\n \"mc2\": 0.42075433716694316,\n\
\ \"mc2_stderr\": 0.014621475505823845\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.014582236460866977,\n\
\ \"acc_norm\": 0.5170648464163823,\n \"acc_norm_stderr\": 0.014602878388536597\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5467038438558056,\n\
\ \"acc_stderr\": 0.004967965810199991,\n \"acc_norm\": 0.7449711212905795,\n\
\ \"acc_norm_stderr\": 0.004349866376068983\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.0307673947078081,\n\
\ \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.0307673947078081\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416542,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416542\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.02286083830923207,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.02286083830923207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45483870967741935,\n\
\ \"acc_stderr\": 0.028327743091561067,\n \"acc_norm\": 0.45483870967741935,\n\
\ \"acc_norm_stderr\": 0.028327743091561067\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.03878372113711274,\n\
\ \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.03878372113711274\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5958549222797928,\n \"acc_stderr\": 0.0354150857888402,\n\
\ \"acc_norm\": 0.5958549222797928,\n \"acc_norm_stderr\": 0.0354150857888402\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4153846153846154,\n \"acc_stderr\": 0.024985354923102325,\n\
\ \"acc_norm\": 0.4153846153846154,\n \"acc_norm_stderr\": 0.024985354923102325\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.37815126050420167,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.37815126050420167,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5853211009174312,\n \"acc_stderr\": 0.02112290320860259,\n \"\
acc_norm\": 0.5853211009174312,\n \"acc_norm_stderr\": 0.02112290320860259\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5637254901960784,\n \"acc_stderr\": 0.03480693138457039,\n \"\
acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.03480693138457039\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610805,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610805\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.43946188340807174,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.43946188340807174,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4171779141104294,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.4171779141104294,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03088273697413866,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03088273697413866\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6079182630906769,\n\
\ \"acc_stderr\": 0.017458524050147636,\n \"acc_norm\": 0.6079182630906769,\n\
\ \"acc_norm_stderr\": 0.017458524050147636\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.026788811931562764,\n\
\ \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.026788811931562764\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\
\ \"acc_stderr\": 0.014716824273017763,\n \"acc_norm\": 0.26256983240223464,\n\
\ \"acc_norm_stderr\": 0.014716824273017763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42483660130718953,\n \"acc_stderr\": 0.02830457667314112,\n\
\ \"acc_norm\": 0.42483660130718953,\n \"acc_norm_stderr\": 0.02830457667314112\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.49517684887459806,\n\
\ \"acc_stderr\": 0.028396770444111298,\n \"acc_norm\": 0.49517684887459806,\n\
\ \"acc_norm_stderr\": 0.028396770444111298\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.027786800931427443,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.027786800931427443\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.0286638201471995,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.0286638201471995\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33833116036505867,\n\
\ \"acc_stderr\": 0.012084265626344194,\n \"acc_norm\": 0.33833116036505867,\n\
\ \"acc_norm_stderr\": 0.012084265626344194\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.02985526139348392,\n\
\ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.02985526139348392\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4264705882352941,\n \"acc_stderr\": 0.02000791273935936,\n \
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.02000791273935936\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.03189141832421397,\n\
\ \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.03189141832421397\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5074626865671642,\n\
\ \"acc_stderr\": 0.03535140084276719,\n \"acc_norm\": 0.5074626865671642,\n\
\ \"acc_norm_stderr\": 0.03535140084276719\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.03733756969066164,\n\
\ \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.03733756969066164\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.01552856663708728,\n \"mc2\": 0.42075433716694316,\n\
\ \"mc2_stderr\": 0.014621475505823845\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-7b-dolphin_10w-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|arc:challenge|25_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hellaswag|10_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T02:39:30.336527.parquet'
- config_name: results
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- results_2023-08-30T02:39:30.336527.parquet
- split: latest
path:
- results_2023-08-30T02:39:30.336527.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-7b-dolphin_10w-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-7b-dolphin_10w-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-7b-dolphin_10w-test](https://huggingface.co/CHIH-HUNG/llama-2-7b-dolphin_10w-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test",
"harness_truthfulqa_mc_0",
split="train")
```
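The aggregated numbers live in the "results" configuration listed in the YAML above; a minimal sketch of loading its latest run (standard `datasets` usage, relying only on the config and split names shown above):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always
# points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test",
    "results",
    split="latest",
)
```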
## Latest results
These are the [latest results from run 2023-08-30T02:39:30.336527](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test/blob/main/results_2023-08-30T02%3A39%3A30.336527.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.441355572237513,
"acc_stderr": 0.035230208768426426,
"acc_norm": 0.44554035341923276,
"acc_norm_stderr": 0.03522008237001183,
"mc1": 0.2692778457772338,
"mc1_stderr": 0.01552856663708728,
"mc2": 0.42075433716694316,
"mc2_stderr": 0.014621475505823845
},
"harness|arc:challenge|25": {
"acc": 0.4684300341296928,
"acc_stderr": 0.014582236460866977,
"acc_norm": 0.5170648464163823,
"acc_norm_stderr": 0.014602878388536597
},
"harness|hellaswag|10": {
"acc": 0.5467038438558056,
"acc_stderr": 0.004967965810199991,
"acc_norm": 0.7449711212905795,
"acc_norm_stderr": 0.004349866376068983
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49056603773584906,
"acc_stderr": 0.0307673947078081,
"acc_norm": 0.49056603773584906,
"acc_norm_stderr": 0.0307673947078081
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416542,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416542
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.02286083830923207,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.02286083830923207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45483870967741935,
"acc_stderr": 0.028327743091561067,
"acc_norm": 0.45483870967741935,
"acc_norm_stderr": 0.028327743091561067
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03255086769970103,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03255086769970103
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5575757575757576,
"acc_stderr": 0.03878372113711274,
"acc_norm": 0.5575757575757576,
"acc_norm_stderr": 0.03878372113711274
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5958549222797928,
"acc_stderr": 0.0354150857888402,
"acc_norm": 0.5958549222797928,
"acc_norm_stderr": 0.0354150857888402
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4153846153846154,
"acc_stderr": 0.024985354923102325,
"acc_norm": 0.4153846153846154,
"acc_norm_stderr": 0.024985354923102325
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.37815126050420167,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.37815126050420167,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5853211009174312,
"acc_stderr": 0.02112290320860259,
"acc_norm": 0.5853211009174312,
"acc_norm_stderr": 0.02112290320860259
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610805,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610805
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.43946188340807174,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.43946188340807174,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4171779141104294,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.4171779141104294,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03088273697413866,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03088273697413866
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6079182630906769,
"acc_stderr": 0.017458524050147636,
"acc_norm": 0.6079182630906769,
"acc_norm_stderr": 0.017458524050147636
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.026788811931562764,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.026788811931562764
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017763,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42483660130718953,
"acc_stderr": 0.02830457667314112,
"acc_norm": 0.42483660130718953,
"acc_norm_stderr": 0.02830457667314112
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.49517684887459806,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.49517684887459806,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.027786800931427443,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.027786800931427443
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.0286638201471995,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.0286638201471995
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33833116036505867,
"acc_stderr": 0.012084265626344194,
"acc_norm": 0.33833116036505867,
"acc_norm_stderr": 0.012084265626344194
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.02985526139348392,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.02985526139348392
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.02000791273935936,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.02000791273935936
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.03189141832421397,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.03189141832421397
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5074626865671642,
"acc_stderr": 0.03535140084276719,
"acc_norm": 0.5074626865671642,
"acc_norm_stderr": 0.03535140084276719
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.03733756969066164,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.03733756969066164
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2692778457772338,
"mc1_stderr": 0.01552856663708728,
"mc2": 0.42075433716694316,
"mc2_stderr": 0.014621475505823845
}
}
```
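For instance, to average the MMLU (hendrycksTest) accuracies out of the dictionary above — a minimal sketch, assuming the JSON has been parsed into a Python dict named `results` (a hypothetical local variable, not part of this dataset's API):
```python
# Average "acc" over all hendrycksTest (MMLU) tasks in the parsed results.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```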
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mooggsentry/mooggsentryRVC | ---
license: openrail
---
|
Nadav/pixel_glue_stsb | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: float32
splits:
- name: train
num_bytes: 82776786.375
num_examples: 5749
- name: validation
num_bytes: 17450726.5
num_examples: 1500
download_size: 97774662
dataset_size: 100227512.875
---
# Dataset Card for "pixel_glue_stsb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b | ---
pretty_name: Evaluation run of shareAI/bimoGPT-llama2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shareAI/bimoGPT-llama2-13b](https://huggingface.co/shareAI/bimoGPT-llama2-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T02:26:34.744739](https://huggingface.co/datasets/open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b/blob/main/results_2023-09-17T02-26-34.744739.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298278,\n \"f1\": 0.05842386744966444,\n\
\ \"f1_stderr\": 0.0013305449660371358,\n \"acc\": 0.43888155205954144,\n\
\ \"acc_stderr\": 0.01031967359624197\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298278,\n\
\ \"f1\": 0.05842386744966444,\n \"f1_stderr\": 0.0013305449660371358\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \
\ \"acc_stderr\": 0.008719339028833059\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650882\n\
\ }\n}\n```"
repo_url: https://huggingface.co/shareAI/bimoGPT-llama2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|arc:challenge|25_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T02_26_34.744739
path:
- '**/details_harness|drop|3_2023-09-17T02-26-34.744739.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T02-26-34.744739.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T02_26_34.744739
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-26-34.744739.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-26-34.744739.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hellaswag|10_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T18:04:32.310000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T18:04:32.310000.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T18:04:32.310000.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T02_26_34.744739
path:
- '**/details_harness|winogrande|5_2023-09-17T02-26-34.744739.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T02-26-34.744739.parquet'
- config_name: results
data_files:
- split: 2023_08_09T18_04_32.310000
path:
- results_2023-08-09T18:04:32.310000.parquet
- split: 2023_09_17T02_26_34.744739
path:
- results_2023-09-17T02-26-34.744739.parquet
- split: latest
path:
- results_2023-09-17T02-26-34.744739.parquet
---
# Dataset Card for Evaluation run of shareAI/bimoGPT-llama2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/shareAI/bimoGPT-llama2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [shareAI/bimoGPT-llama2-13b](https://huggingface.co/shareAI/bimoGPT-llama2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b",
"harness_winogrande_5",
	split="latest")
```
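If you are not sure which of the 64 configurations you need, you can enumerate them first. A minimal sketch using `get_dataset_config_names` from the `datasets` library:

```python
from datasets import get_dataset_config_names

# Lists every configuration declared in this repo (the per-task details plus "results")
configs = get_dataset_config_names("open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b")
print(configs)
```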
## Latest results
These are the [latest results from run 2023-09-17T02:26:34.744739](https://huggingface.co/datasets/open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b/blob/main/results_2023-09-17T02-26-34.744739.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298278,
"f1": 0.05842386744966444,
"f1_stderr": 0.0013305449660371358,
"acc": 0.43888155205954144,
"acc_stderr": 0.01031967359624197
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298278,
"f1": 0.05842386744966444,
"f1_stderr": 0.0013305449660371358
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.008719339028833059
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650882
}
}
```
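For the aggregated numbers alone, you can load the "results" configuration directly. A minimal sketch, relying only on the "results" config and the "latest" split declared in this card's metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split points to the newest run
results = load_dataset(
    "open-llm-leaderboard/details_shareAI__bimoGPT-llama2-13b",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics of the most recent evaluation
```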
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mxronga/yoruba-proverbs-parallel-corpora | ---
license: apache-2.0
language:
- yo
tags:
- pretrain
---
Parallel corpus for Yoruba to English.
Source: http://yoruba.unl.edu/yoruba1.html |
liuyanchen1015/MULTI_VALUE_qqp_past_for_past_participle | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 271095
num_examples: 1480
- name: test
num_bytes: 2533764
num_examples: 13791
- name: train
num_bytes: 2334435
num_examples: 12752
download_size: 3194795
dataset_size: 5139294
---
# Dataset Card for "MULTI_VALUE_qqp_past_for_past_participle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hodgesz/validated-sql-create-context | ---
license: apache-2.0
---
|
Rafitrians/Ajijoy | ---
license: other
---
|
Severian/Bio-Design-Process | ---
license: artistic-2.0
---
This dataset works even though it may not be the cleanest in terms of organization. I'm working on cleaning it up for better performance, but it should still work as long as you don't overtrain on it. |
yangyz1230/H3_not_filtered | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 299549
num_examples: 545
- name: test
num_bytes: 33279
num_examples: 61
download_size: 164991
dataset_size: 332828
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_A_D_PNP_GENERIC_C_Q_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__
num_bytes: 140863
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 141014
num_examples: 1000
download_size: 105742
dataset_size: 281877
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_A_D_PNP_GENERIC_C_Q_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ambet/chat_style_conversation | ---
dataset_info:
features:
- name: 'text '
dtype: string
- name: __index_level_0__
dtype: string
- name: __index_level_1__
dtype: string
- name: __index_level_2__
dtype: string
splits:
- name: train
num_bytes: 269427.3133781307
num_examples: 1227
- name: test
num_bytes: 90028.68662186927
num_examples: 410
download_size: 163086
dataset_size: 359456.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
LanguageBind/Video-Bench | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Joseph717171__Tess-10.7B-v2.0 | ---
pretty_name: Evaluation run of Joseph717171/Tess-10.7B-v2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Joseph717171/Tess-10.7B-v2.0](https://huggingface.co/Joseph717171/Tess-10.7B-v2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Joseph717171__Tess-10.7B-v2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T16:07:29.849184](https://huggingface.co/datasets/open-llm-leaderboard/details_Joseph717171__Tess-10.7B-v2.0/blob/main/results_2024-03-30T16-07-29.849184.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4992826261878715,\n\
\ \"acc_stderr\": 0.03426501694925459,\n \"acc_norm\": 0.5059033509632049,\n\
\ \"acc_norm_stderr\": 0.0350107347322564,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4462553482634578,\n\
\ \"mc2_stderr\": 0.0158028727847505\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.01460966744089257,\n\
\ \"acc_norm\": 0.5511945392491467,\n \"acc_norm_stderr\": 0.014534599585097662\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5668193586934873,\n\
\ \"acc_stderr\": 0.0049450236570322765,\n \"acc_norm\": 0.7439753037243577,\n\
\ \"acc_norm_stderr\": 0.004355436696716298\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5509433962264151,\n \"acc_stderr\": 0.030612730713641095,\n\
\ \"acc_norm\": 0.5509433962264151,\n \"acc_norm_stderr\": 0.030612730713641095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681906,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681906\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5838709677419355,\n\
\ \"acc_stderr\": 0.028040981380761547,\n \"acc_norm\": 0.5838709677419355,\n\
\ \"acc_norm_stderr\": 0.028040981380761547\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.036974422050315967,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.036974422050315967\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736236,\n\
\ \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736236\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6825688073394496,\n \"acc_stderr\": 0.019957152198460493,\n \"\
acc_norm\": 0.6825688073394496,\n \"acc_norm_stderr\": 0.019957152198460493\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.0325685057029365,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.0325685057029365\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.0343413116471913,\n \"acc_norm\"\
: 0.6029411764705882,\n \"acc_norm_stderr\": 0.0343413116471913\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \"acc_norm\"\
: 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n },\n\
\ \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.0435644720266507,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.0435644720266507\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560392,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560392\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5759897828863346,\n\
\ \"acc_stderr\": 0.017672263329084226,\n \"acc_norm\": 0.5759897828863346,\n\
\ \"acc_norm_stderr\": 0.017672263329084226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.026700545424943677,\n\
\ \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.026700545424943677\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808842,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808842\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.028408302020332687,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.028408302020332687\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930477,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930477\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542602,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542602\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419994,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419994\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3683181225554107,\n\
\ \"acc_stderr\": 0.012319403369564639,\n \"acc_norm\": 0.3683181225554107,\n\
\ \"acc_norm_stderr\": 0.012319403369564639\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3860294117647059,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.3860294117647059,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5359477124183006,\n \"acc_stderr\": 0.020175488765484043,\n \
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.020175488765484043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6023391812865497,\n \"acc_stderr\": 0.03753638955761691,\n\
\ \"acc_norm\": 0.6023391812865497,\n \"acc_norm_stderr\": 0.03753638955761691\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839673,\n \"mc2\": 0.4462553482634578,\n\
\ \"mc2_stderr\": 0.0158028727847505\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6527229676400947,\n \"acc_stderr\": 0.013380909249751242\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17589082638362397,\n \
\ \"acc_stderr\": 0.01048712063553962\n }\n}\n```"
repo_url: https://huggingface.co/Joseph717171/Tess-10.7B-v2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|arc:challenge|25_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|gsm8k|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hellaswag|10_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-07-29.849184.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T16-07-29.849184.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- '**/details_harness|winogrande|5_2024-03-30T16-07-29.849184.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T16-07-29.849184.parquet'
- config_name: results
data_files:
- split: 2024_03_30T16_07_29.849184
path:
- results_2024-03-30T16-07-29.849184.parquet
- split: latest
path:
- results_2024-03-30T16-07-29.849184.parquet
---
# Dataset Card for Evaluation run of Joseph717171/Tess-10.7B-v2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Joseph717171/Tess-10.7B-v2.0](https://huggingface.co/Joseph717171/Tess-10.7B-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Joseph717171__Tess-10.7B-v2.0",
"harness_winogrande_5",
	split="latest")
```
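The aggregated metrics can be loaded the same way from the "results" configuration (a minimal sketch following the pattern above; the config and split names come from the YAML at the top of this card):
```python
from datasets import load_dataset

# The aggregated metrics live in the "results" configuration; the "latest"
# split always points at the most recent run (see the split list above).
results = load_dataset("open-llm-leaderboard/details_Joseph717171__Tess-10.7B-v2.0",
	"results",
	split="latest")
```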
## Latest results
These are the [latest results from run 2024-03-30T16:07:29.849184](https://huggingface.co/datasets/open-llm-leaderboard/details_Joseph717171__Tess-10.7B-v2.0/blob/main/results_2024-03-30T16-07-29.849184.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.4992826261878715,
"acc_stderr": 0.03426501694925459,
"acc_norm": 0.5059033509632049,
"acc_norm_stderr": 0.0350107347322564,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4462553482634578,
"mc2_stderr": 0.0158028727847505
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.01460966744089257,
"acc_norm": 0.5511945392491467,
"acc_norm_stderr": 0.014534599585097662
},
"harness|hellaswag|10": {
"acc": 0.5668193586934873,
"acc_stderr": 0.0049450236570322765,
"acc_norm": 0.7439753037243577,
"acc_norm_stderr": 0.004355436696716298
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5509433962264151,
"acc_stderr": 0.030612730713641095,
"acc_norm": 0.5509433962264151,
"acc_norm_stderr": 0.030612730713641095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37872340425531914,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.37872340425531914,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.02441923496681906,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.02441923496681906
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848878,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848878
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5838709677419355,
"acc_stderr": 0.028040981380761547,
"acc_norm": 0.5838709677419355,
"acc_norm_stderr": 0.028040981380761547
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.036974422050315967,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.036974422050315967
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.034588160421810114,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.034588160421810114
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5230769230769231,
"acc_stderr": 0.025323990861736236,
"acc_norm": 0.5230769230769231,
"acc_norm_stderr": 0.025323990861736236
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47478991596638653,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.47478991596638653,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6825688073394496,
"acc_stderr": 0.019957152198460493,
"acc_norm": 0.6825688073394496,
"acc_norm_stderr": 0.019957152198460493
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.0325685057029365,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.0325685057029365
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.0343413116471913,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.0343413116471913
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.0435644720266507,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.0435644720266507
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560392,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560392
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5759897828863346,
"acc_stderr": 0.017672263329084226,
"acc_norm": 0.5759897828863346,
"acc_norm_stderr": 0.017672263329084226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.026700545424943677,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.026700545424943677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808842,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808842
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.028408302020332687,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.028408302020332687
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930477,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930477
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.027701228468542602,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.027701228468542602
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.029427994039419994,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.029427994039419994
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3683181225554107,
"acc_stderr": 0.012319403369564639,
"acc_norm": 0.3683181225554107,
"acc_norm_stderr": 0.012319403369564639
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3860294117647059,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.3860294117647059,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.020175488765484043,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.020175488765484043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6023391812865497,
"acc_stderr": 0.03753638955761691,
"acc_norm": 0.6023391812865497,
"acc_norm_stderr": 0.03753638955761691
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839673,
"mc2": 0.4462553482634578,
"mc2_stderr": 0.0158028727847505
},
"harness|winogrande|5": {
"acc": 0.6527229676400947,
"acc_stderr": 0.013380909249751242
},
"harness|gsm8k|5": {
"acc": 0.17589082638362397,
"acc_stderr": 0.01048712063553962
}
}
```
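As a rough illustration (a hypothetical snippet, not part of the evaluation harness), the per-task metrics in a results dict like the one above can be averaged with the standard library:
```python
import json
from statistics import mean

# Assumes the results dict shown above has been saved locally as results.json
# (hypothetical path; adjust to wherever you downloaded the file).
with open("results.json") as f:
    results = json.load(f)

# Average acc_norm over every task that reports it; the "all" entry already
# holds the harness's own aggregates, so it is skipped here.
accs = [v["acc_norm"] for k, v in results.items() if k != "all" and "acc_norm" in v]
print(f"mean acc_norm over {len(accs)} tasks: {mean(accs):.4f}")
```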
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MaralGPT/chinkara_open_qa | ---
license: mit
language:
- en
---
# Chinkara Open Question Answering Dataset
For more information on this dataset and how we made it, please visit our [Github page](https://github.com/prp-e/chinkara-openqa-dataset). |
tyzhu/squad_no_title_strict_v4_train_30_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 510871.98481973435
num_examples: 368
- name: validation
num_bytes: 48707
num_examples: 50
download_size: 81258
dataset_size: 559578.9848197344
---
# Dataset Card for "squad_no_title_strict_v4_train_30_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aengusl/test_hp_task_unlrn_ds | ---
dataset_info:
features:
- name: question
dtype: string
- name: true_answer
dtype: string
- name: false_answer
dtype: string
splits:
- name: train
num_bytes: 40190
num_examples: 312
download_size: 22762
dataset_size: 40190
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Minner/DataSetting | ---
license: unknown
---
|
vwxyzjn/cai-conversation-dev1705628758 | ---
dataset_info:
features:
- name: init_prompt
dtype: string
- name: init_response
dtype: string
- name: critic_prompt
dtype: string
- name: critic_response
dtype: string
- name: revision_prompt
dtype: string
- name: revision_response
dtype: string
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 240834
num_examples: 64
- name: train_prefs
num_bytes: 238847
num_examples: 64
- name: test_sft
num_bytes: 254525
num_examples: 64
- name: test_prefs
num_bytes: 257663
num_examples: 64
download_size: 538984
dataset_size: 991869
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: train_prefs
path: data/train_prefs-*
- split: test_sft
path: data/test_sft-*
- split: test_prefs
path: data/test_prefs-*
---
# Dataset Card for "cai-conversation-dev1705628758"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/CSIC_BERT_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115621178.4375
num_examples: 37500
- name: test
num_bytes: 38540392.5
num_examples: 12500
download_size: 211874011
dataset_size: 154161570.9375
---
# Dataset Card for "CSIC_BERT_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hails/agieval-sat-en | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 1019867
num_examples: 206
download_size: 265934
dataset_size: 1019867
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "agieval-sat-en"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo, following dmayhem93/agieval-* datasets on the HF hub.
This dataset contains the contents of the SAT-en subtask of AGIEval, as accessed in https://github.com/ruixiangcui/AGIEval/commit/5c77d073fda993f1652eaae3cf5d04cc5fd21d40 .
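A minimal loading sketch, using the `default` config and `test` split declared in the YAML header above:
```python
from datasets import load_dataset

# 206 multiple-choice examples with query / choices / gold fields.
ds = load_dataset("hails/agieval-sat-en", split="test")
example = ds[0]
print(example["query"])    # question text
print(example["choices"])  # list of answer options
print(example["gold"])     # indices of the correct option(s)
```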
Citation:
@misc{zhong2023agieval,
title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
year={2023},
eprint={2304.06364},
archivePrefix={arXiv},
primaryClass={cs.CL}
} |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-49000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1001877
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kayteekay/bookimg_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 289585512.68
num_examples: 32581
download_size: 0
dataset_size: 289585512.68
---
# Dataset Card for "bookimg_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonathanli/echr | ---
license: cc-by-nc-sa-4.0
---
# ECHR Cases
The original data from [Chalkidis et al.](https://arxiv.org/abs/1906.02059), sourced from [archive.org](https://archive.org/details/ECHR-ACL2019).
## Preprocessing
* Order is shuffled
* Fact numbers preceding each fact are removed (using the Python regex `^[0-9]+\. `, as sketched below), as some cases didn't have fact numbers to begin with
* Everything else is the same
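A minimal sketch of the fact-number stripping step, applied to one fact string at a time (the regex is the one quoted above):
```python
import re

# Strip leading fact numbers such as "12. ", matching the regex above.
FACT_NUM = re.compile(r"^[0-9]+\. ")

def strip_fact_number(fact: str) -> str:
    return FACT_NUM.sub("", fact)

print(strip_fact_number("12. The applicant was arrested on 3 May."))
# -> "The applicant was arrested on 3 May."
```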
|
BubbleJoe/snli_unified_input | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: input
dtype: string
splits:
- name: test
num_bytes: 2617808
num_examples: 10000
- name: train
num_bytes: 137270292
num_examples: 550152
- name: validation
num_bytes: 2626072
num_examples: 10000
download_size: 40318460
dataset_size: 142514172
---
# Dataset Card for "snli_unified_input"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mehdidc/compositionality_tifa | ---
dataset_info:
features:
- name: caption
dtype: string
- name: caption_source
dtype: string
- name: image_0_url
dtype: string
- name: image_1_url
dtype: string
- name: label_0
dtype: float64
- name: label_1
dtype: float64
- name: num_example_per_prompt
dtype: int64
- name: model_0
dtype: string
- name: model_1
dtype: string
- name: jpg_0
dtype: binary
- name: jpg_1
dtype: binary
- name: are_different
dtype: bool
- name: has_label
dtype: bool
- name: origin
dtype: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 141382592.0
num_examples: 1600
download_size: 140531023
dataset_size: 141382592.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/meili_portroute_rezero | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of meili_portroute (Re:Zero Kara Hajimeru Isekai Seikatsu)
This is the dataset of meili_portroute (Re:Zero Kara Hajimeru Isekai Seikatsu), containing 23 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
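A loading sketch following the pattern of the other CyberHarem cards in this dump; the `dataset-raw.zip` filename is an assumption carried over from those cards:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# Download the raw archive (filename assumed, as in the sibling cards).
zip_file = hf_hub_download(
    repo_id='CyberHarem/meili_portroute_rezero',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# Extract the archive, then iterate over the tagged images with waifuc.
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

for item in LocalSource(dataset_dir):
    print(item.image, item.meta['filename'], item.meta['tags'])
```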
|
skeskinen/TinyStories-hf | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1911420483
num_examples: 2119719
- name: validation
num_bytes: 19306310
num_examples: 21990
download_size: 1000775442
dataset_size: 1930726793
---
A description of this dataset can be found at https://arxiv.org/abs/2305.07759
Copied from roneneldan/TinyStories
Modified with:
```
import ftfy.bad_codecs  # registers the 'sloppy-windows-1252' codec on import
from datasets import Dataset, DatasetDict

# Read the raw dumps; the sloppy codec tolerates mis-encoded Windows-1252 bytes.
train = open('./TinyStories-train.txt', 'r', encoding='sloppy-windows-1252').read()
train = train.split('<|endoftext|>')  # one story per chunk
train = [l.strip() for l in train]
valid = open('./TinyStories-valid.txt', 'r', encoding='sloppy-windows-1252').read()
valid = valid.split('<|endoftext|>')
valid = [l.strip() for l in valid]

# Wrap the story lists in a DatasetDict and persist to disk.
dataset = DatasetDict({
    'train': Dataset.from_dict({'text': train}),
    'validation': Dataset.from_dict({'text': valid}),
})
dataset.save_to_disk('./TinyStories')
``` |
tyzhu/find_first_sent_train_50_eval_40 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 182048
num_examples: 140
- name: validation
num_bytes: 39025
num_examples: 40
download_size: 0
dataset_size: 221073
---
# Dataset Card for "find_first_sent_train_50_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Atipico1/NQ-colbert-top-10 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 574077206
num_examples: 87925
- name: test
num_bytes: 23673906
num_examples: 3610
download_size: 340649717
dataset_size: 597751112
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ajibawa-2023/WikiHow | ---
license: apache-2.0
task_categories:
- text-generation
- question-answering
- text2text-generation
language:
- en
size_categories:
- 100K<n<1M
---
**WikiHow**
A large corpus of WikiHow articles with more than 175,000 entries, very useful for educational training purposes. The data is in JSON format.
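A minimal loading sketch; the `train` split name is an assumption and the field names are not documented in this card, so the features are inspected first:
```python
from datasets import load_dataset

# "train" split is an assumption; the card does not document its splits.
ds = load_dataset("ajibawa-2023/WikiHow", split="train")
print(ds.features)  # discover the actual column names
print(ds[0])        # peek at one record
```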
|
liuyanchen1015/MULTI_VALUE_rte_object_pronoun_drop | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 99406
num_examples: 193
- name: train
num_bytes: 92593
num_examples: 182
download_size: 139152
dataset_size: 191999
---
# Dataset Card for "MULTI_VALUE_rte_object_pronoun_drop"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kanishka/counterfactual_babylm_without_prototypical | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581830133
num_examples: 11633299
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 421630845
dataset_size: 637950363
---
# Dataset Card for "counterfactual_babylm_without_prototypical"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Wi11Chan/glhwater_crop3200 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: training
num_bytes: 4180365155.2
num_examples: 3200
- name: validation
num_bytes: 441798907.0
num_examples: 400
- name: testing
num_bytes: 441798907.0
num_examples: 400
download_size: 5209227311
dataset_size: 5063962969.2
---
# Dataset Card for "glhwater_crop3200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arieg/bw_spec_cls_80_02 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '4071'
'1': '4072'
'2': '4073'
'3': '4074'
'4': '4075'
'5': '4076'
'6': '4077'
'7': '4078'
'8': '4079'
'9': '4080'
'10': '4091'
'11': '4092'
'12': '4093'
'13': '4094'
'14': '4095'
'15': '4096'
'16': '4097'
'17': '4098'
'18': '4099'
'19': '4100'
'20': '4101'
'21': '4102'
'22': '4103'
'23': '4232'
'24': '4233'
'25': '4234'
'26': '4235'
'27': '4236'
'28': '4239'
'29': '4450'
'30': '4507'
'31': '4508'
'32': '4509'
'33': '4510'
'34': '4511'
'35': '4519'
'36': '4520'
'37': '4521'
'38': '4522'
'39': '4682'
'40': '4684'
'41': '4685'
'42': '4688'
'43': '4777'
'44': '4778'
'45': '4779'
'46': '4780'
'47': '4781'
'48': '4782'
'49': '4784'
'50': '4785'
'51': '4786'
'52': '4787'
'53': '4788'
'54': '4799'
'55': '4835'
'56': '4836'
'57': '4838'
'58': '4846'
'59': '4848'
'60': '4849'
'61': '5006'
'62': '5156'
'63': '5157'
'64': '5158'
'65': '5159'
'66': '5169'
'67': '5170'
'68': '5171'
'69': '5264'
'70': '5268'
'71': '5376'
'72': '5381'
'73': '5521'
'74': '5879'
'75': '5936'
'76': '5940'
'77': '6329'
'78': '6330'
'79': '6332'
splits:
- name: train
num_bytes: 91635203.2
num_examples: 1600
download_size: 90024456
dataset_size: 91635203.2
---
# Dataset Card for "bw_spec_cls_80_02"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Partha117/oss_bugs_with_content | ---
dataset_info:
features:
- name: status
dtype: string
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: issue_id
dtype: int64
- name: title
dtype: string
- name: body
dtype: string
- name: issue_url
dtype: string
- name: pull_url
dtype: string
- name: before_fix_sha
dtype: string
- name: after_fix_sha
dtype: string
- name: report_datetime
dtype: timestamp[us, tz=UTC]
- name: language
dtype: string
- name: commit_datetime
dtype: timestamp[us, tz=UTC]
- name: updated_file
dtype: string
- name: file_content
dtype: string
splits:
- name: train
num_bytes: 2522992380
num_examples: 77121
download_size: 678968673
dataset_size: 2522992380
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
am-not-a-scientist/test1 | ---
license: unknown
---
|
MohammedNasri/train_prepared | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 39269091968
num_examples: 40880
download_size: 6221847206
dataset_size: 39269091968
---
# Dataset Card for "train_prepared"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ume_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ume/梅 (Kantai Collection)
This is the dataset of ume/梅 (Kantai Collection), containing 81 images and their tags.
The core tags of this character are `green_hair, multicolored_hair, pink_hair, ahoge, short_hair, streaked_hair, glasses, pink-framed_eyewear, breasts, large_breasts, ponytail, red_eyes, red_bow, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 81 | 86.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ume_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 81 | 50.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ume_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 181 | 102.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ume_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 81 | 77.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ume_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 181 | 147.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ume_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ume_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 54 |  |  |  |  |  | serafuku, grey_sailor_collar, 1girl, sailor_shirt, juliet_sleeves, solo, corset, green_shirt, bowtie, blouse, white_background, grey_skirt, simple_background, pleated_skirt, upper_body, smile, looking_at_viewer |
| 1 | 17 |  |  |  |  |  | 1girl, solo, simple_background, looking_at_viewer, white_background, bob_cut, alternate_costume, cowboy_shot, dated, one-hour_drawing_challenge |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | serafuku | grey_sailor_collar | 1girl | sailor_shirt | juliet_sleeves | solo | corset | green_shirt | bowtie | blouse | white_background | grey_skirt | simple_background | pleated_skirt | upper_body | smile | looking_at_viewer | bob_cut | alternate_costume | cowboy_shot | dated | one-hour_drawing_challenge |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------|:---------------------|:--------|:---------------|:-----------------|:-------|:---------|:--------------|:---------|:---------|:-------------------|:-------------|:--------------------|:----------------|:-------------|:--------|:--------------------|:----------|:--------------------|:--------------|:--------|:-----------------------------|
| 0 | 54 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 17 |  |  |  |  |  | | | X | | | X | | | | | X | | X | | | | X | X | X | X | X | X |
|
yzhuang/autotree_pmlb_ring_sgosdt_l256_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 308080000
num_examples: 10000
- name: validation
num_bytes: 308080000
num_examples: 10000
download_size: 206130038
dataset_size: 616160000
---
# Dataset Card for "autotree_pmlb_ring_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-54000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 652419
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
btyt7/the-italian-cook-book | ---
license: openrail
task_categories:
- text-generation
language:
- en
tags:
- food
- recipe
pretty_name: 'Gutenberg: The Italian Cook Book'
size_categories:
- n<1K
--- |
jeevan-23/w9_train_test_split | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: string
- name: query
struct:
- name: question
dtype: string
- name: answers
sequence: string
- name: words
sequence: string
- name: bounding_boxes
sequence:
sequence: int64
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 124487484.92307693
num_examples: 180
- name: test
num_bytes: 13831942.76923077
num_examples: 20
download_size: 148303453
dataset_size: 138319427.6923077
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
larryvrh/belle_filtered_2.8M_CN | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 4151854934
num_examples: 2802515
download_size: 2513439396
dataset_size: 4151854934
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: gpl-3.0
task_categories:
- text-generation
- conversational
language:
- zh
size_categories:
- 1M<n<10M
---
# Dataset Card for "belle_filtered_2.8M_CN"
A heuristically filtered version of [BelleGroup/train_3.5M_CN](https://huggingface.co/datasets/BelleGroup/train_3.5M_CN), with obvious errors and low-quality content removed from the corpus.
## Examples
```
{
"id":"66623028",
"conversations":[
{
"from":"human",
"value":"归纳介绍三种不同类型的电影题材"
},
{
"from":"assistant",
"value":"1. 动作片:动作片通常强调动作和战斗场面,故事情节紧凑,充满了紧张和兴奋的氛围。主角一般是具有超凡实力或特殊技能的英雄人物,往往会与反派角色进行一系列激烈的格斗,追逐和爆破等酷炫场面。例如《速度与激情》和《复仇者联盟》等。\n2. 爱情片:爱情片是以爱情为主题的电影,着重呈现角色之间的感情纠葛,展现了爱情的美好和悲壮。情节与角色之间的发展令人深思,增加了观众的感性体验。这种电影的故事情节主要以人类与人类之间的情感变化为主题。例如《泰坦尼克号》和《让子弹飞》等。\n3. 科幻片:科幻片是以未来科技和未知世界为主题的电影,通常包括时间旅行、外星人、虚拟现实、未来社会等各种奇幻的元素。这种电影描绘了一种比现实更加宏伟、更加神秘和惊奇的世界。例如《星际穿越》和《发条橙》等。"
}
]
}
{
"id":"18205437",
"conversations":[
{
"from":"human",
"value":"根据给定的食材,编写一道美食菜谱。\n鸡腿肉、蘑菇、洋葱"
},
{
"from":"assistant",
"value":"鸡肉炒蘑菇洋葱\n食材:\n- 鸡腿肉 400克,去骨去皮切成小块\n- 蘑菇 200克,切片\n- 洋葱 1个,切丝\n- 生姜 适量,切丝\n- 酱油 2勺\n- 料酒 1勺\n- 盐 适量\n- 白胡椒粉 适量\n- 植物油 适量\n做法:\n1. 在一个碗里放入切好的鸡腿肉块,加入酱油、料酒和适量的白胡椒粉,搅拌均匀,腌制10分钟。\n2. 热锅凉油,加入姜丝翻炒,放入腌好的鸡肉块翻炒至变色。\n3. 加入洋葱丝和蘑菇,翻炒2-3分钟至蘑菇变软。\n4. 加入适量的盐和少量的水,盖上锅盖焖煮5分钟直至鸡肉熟透。\n5. 最后翻炒几下,使汤汁变浓稠即可。\n温馨提示:\n1. 在腌制鸡肉时,如果想让肉质更嫩滑,可以加入少量的淀粉和蛋清。\n2. 可以根据个人口味加入其他调料,如蒜末、辣椒等。\n3. 洋葱和蘑菇可以用其他蔬菜代替,如青椒、胡萝卜等。"
},
{
"from":"human",
"value":"太好了,我会试着照着你教的做一次,但是我最近买了一部新相机,想让你推荐一些好的拍照地点。"
},
{
"from":"assistant",
"value":"当地的风景区、公园、景点等等都是比较适合拍照的地点。如果你想要一些特别的地方,可以考虑一些建筑群,街道巷子、咖啡馆、书店等具有文艺气息的地方。"
},
{
"from":"human",
"value":"嗯,这些地方都很不错,但是我还想找一些可以体验当地文化和风俗的地点,你有什么好的建议吗?"
},
{
"from":"assistant",
"value":"如果你想要更深入地了解当地文化和风俗,可以考虑参加一些游学活动,比如参观当地的博物馆、参加民俗节庆、尝试当地美食等等。此外,也可以去一些小城镇,感受那里的古朴风情。"
}
]
}
```
### Fields:
```
id: record ID
conversations: record content, given in dialogue form; includes both multi-turn and single-turn data
category: record category
```
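A minimal loading sketch using the `default` config declared in the YAML header:
```python
from datasets import load_dataset

ds = load_dataset("larryvrh/belle_filtered_2.8M_CN", split="train")

# Each record holds a multi-turn conversation plus its category.
sample = ds[0]
print(sample["id"], sample["category"])
for turn in sample["conversations"]:
    print(turn["from"], ":", turn["value"][:50])
```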
Composition of the filtered data:
| Category | Count |
|----------------|---------:|
| close qa | 112,570 |
| classification | 125,623 |
| extract | 6,400 |
| open qa | 385,306 |
| harmless | 45,968 |
| role playing | 465,782 |
| rewrite | 28,146 |
| code | 180,825 |
| translation | 29,923 |
| summarization | 99,017 |
| math | 106,202 |
| generation |1,023,643 |
| brainstorming | 193,110 | |
DIBT/MPEP_DUTCH | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for MPEP_DUTCH
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("DIBT/MPEP_DUTCH")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("DIBT/MPEP_DUTCH")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| source | Source | text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| target | Vertaling | text | True | Vertaal de prompt naar het Nederlands. | N/A |
The **suggestions** are human- or machine-generated recommendations for each question, provided to assist the annotator during the annotation process. They are always linked to the existing questions and are named by appending "-suggestion" and "-suggestion-metadata", which hold the value(s) of the suggestion and its metadata, respectively. Accordingly, the possible values are the same as in the table above, but the column names carry the "-suggestion" and "-suggestion-metadata" suffixes.
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": "1788",
"fields": {
"source": "As an AI enthusiast, you love creating programs that can understand human language. Your latest project involves creating a program that can identify and replace words with their antonyms in a given text. \nTo demonstrate the effectiveness of your program, you decide to test it on a news article about a recent political event. However, to make it more challenging, you also want your program to differentiate between homonyms and use context clues to correctly replace them.\nHere\u0027s a step-by-step explanation of how your program works:\n1. The program reads the input text and identifies all the words that have antonyms.\n2. For each of these words, the program identifies the context in which it appears to determine the correct antonym to use.\n3. The program then replaces the original word with its antonym in the text.\n4. If the original word has multiple meanings, the program uses the context to determine which meaning is intended and replace it with the appropriate antonym.\n5. Finally, the program outputs the modified text with the replaced words.\nCan you put your natural language processing skills to the test and try to identify the antonyms used in the modified text?"
},
"metadata": {
"evolved_from": null,
"kind": "synthetic",
"source": "evol_instruct"
},
"responses": [
{
"status": "submitted",
"user_id": "315be0ef-8df8-48df-8ba1-ff3339ef016a",
"values": {
"target": {
"value": "Als een AI-enthousiasteling, houd je ervan om programma\u0027s te maken die de menselijke taal begrijpen. Je nieuwste project is een programma dat woorden kan herkennen en vervangen door hun antoniemen in een stuk tekst.\nOm de effectiviteit van je programma aan te tonen, besluit je het te testen op een nieuwsartikel over een recent politiek evenement. Om het uitdagender te maken, wil je ook dat je programma onderscheid maakt tussen homoniemen, en daarnaast contextuele aanwijzingen gebruikt woorden correct te vervangen.\nHier is een stapsgewijze uitleg van hoe je programma werkt:\n1. Het programma leest de inputtekst en identificeert alle woorden die antoniemen hebben.\n2. Voor elk van deze woorden identificeert het programma de context waarin het verschijnt om het juiste antoniem te bepalen.\n3. Het programma vervangt vervolgens het originele woord door zijn antoniem in de tekst.\n4. Als het originele woord meerdere betekenissen heeft, gebruikt het programma de context om te bepalen welke betekenis bedoeld is en vervangt het door het passende antoniem.\n5. Uiteindelijk geeft het programma de aangepaste tekst uit met de vervangen woorden.\nKun je je vaardigheden in natuurlijke taalverwerking op de proef stellen en proberen de antoniemen te identificeren die in de gewijzigde tekst worden gebruikt?"
}
}
}
],
"suggestions": [
{
"agent": null,
"question_name": "target",
"score": null,
"type": null,
"value": "Als een AI-enthousiasteling, houd je ervan om programma\u0027s te maken die de menselijke taal begrijpen. Je nieuwste project houdt in dat je een programma maakt dat woorden kan identificeren en vervangen door hun antoniemen in een gegeven tekst.\nOm de effectiviteit van je programma aan te tonen, besluit je het te testen op een nieuwsartikel over een recent politiek evenement. Om het echter uitdagender te maken, wil je ook dat je programma onderscheid maakt tussen homoniemen en contextuele aanwijzingen gebruikt om ze correct te vervangen.\nHier is een stapsgewijze uitleg van hoe je programma werkt:\n1. Het programma leest de inputtekst en identificeert alle woorden die antoniemen hebben.\n2. Voor elk van deze woorden identificeert het programma de context waarin het verschijnt om het juiste antoniem te bepalen.\n3. Het programma vervangt vervolgens het originele woord door zijn antoniem in de tekst.\n4. Als het originele woord meerdere betekenissen heeft, gebruikt het programma de context om te bepalen welke betekenis bedoeld is en vervangt het door het passende antoniem.\n5. Uiteindelijk geeft het programma de aangepaste tekst uit met de vervangen woorden.\nKun je je vaardigheden in natuurlijke taalverwerking op de proef stellen en proberen de antoniemen te identificeren die in de gewijzigde tekst worden gebruikt?"
}
],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": "1788",
"metadata": "{\"evolved_from\": null, \"kind\": \"synthetic\", \"source\": \"evol_instruct\"}",
"source": "As an AI enthusiast, you love creating programs that can understand human language. Your latest project involves creating a program that can identify and replace words with their antonyms in a given text. \nTo demonstrate the effectiveness of your program, you decide to test it on a news article about a recent political event. However, to make it more challenging, you also want your program to differentiate between homonyms and use context clues to correctly replace them.\nHere\u0027s a step-by-step explanation of how your program works:\n1. The program reads the input text and identifies all the words that have antonyms.\n2. For each of these words, the program identifies the context in which it appears to determine the correct antonym to use.\n3. The program then replaces the original word with its antonym in the text.\n4. If the original word has multiple meanings, the program uses the context to determine which meaning is intended and replace it with the appropriate antonym.\n5. Finally, the program outputs the modified text with the replaced words.\nCan you put your natural language processing skills to the test and try to identify the antonyms used in the modified text?",
"target": [
{
"status": "submitted",
"user_id": "315be0ef-8df8-48df-8ba1-ff3339ef016a",
"value": "Als een AI-enthousiasteling, houd je ervan om programma\u0027s te maken die de menselijke taal begrijpen. Je nieuwste project is een programma dat woorden kan herkennen en vervangen door hun antoniemen in een stuk tekst.\nOm de effectiviteit van je programma aan te tonen, besluit je het te testen op een nieuwsartikel over een recent politiek evenement. Om het uitdagender te maken, wil je ook dat je programma onderscheid maakt tussen homoniemen, en daarnaast contextuele aanwijzingen gebruikt woorden correct te vervangen.\nHier is een stapsgewijze uitleg van hoe je programma werkt:\n1. Het programma leest de inputtekst en identificeert alle woorden die antoniemen hebben.\n2. Voor elk van deze woorden identificeert het programma de context waarin het verschijnt om het juiste antoniem te bepalen.\n3. Het programma vervangt vervolgens het originele woord door zijn antoniem in de tekst.\n4. Als het originele woord meerdere betekenissen heeft, gebruikt het programma de context om te bepalen welke betekenis bedoeld is en vervangt het door het passende antoniem.\n5. Uiteindelijk geeft het programma de aangepaste tekst uit met de vervangen woorden.\nKun je je vaardigheden in natuurlijke taalverwerking op de proef stellen en proberen de antoniemen te identificeren die in de gewijzigde tekst worden gebruikt?"
}
],
"target-suggestion": "Als een AI-enthousiasteling, houd je ervan om programma\u0027s te maken die de menselijke taal begrijpen. Je nieuwste project houdt in dat je een programma maakt dat woorden kan identificeren en vervangen door hun antoniemen in een gegeven tekst.\nOm de effectiviteit van je programma aan te tonen, besluit je het te testen op een nieuwsartikel over een recent politiek evenement. Om het echter uitdagender te maken, wil je ook dat je programma onderscheid maakt tussen homoniemen en contextuele aanwijzingen gebruikt om ze correct te vervangen.\nHier is een stapsgewijze uitleg van hoe je programma werkt:\n1. Het programma leest de inputtekst en identificeert alle woorden die antoniemen hebben.\n2. Voor elk van deze woorden identificeert het programma de context waarin het verschijnt om het juiste antoniem te bepalen.\n3. Het programma vervangt vervolgens het originele woord door zijn antoniem in de tekst.\n4. Als het originele woord meerdere betekenissen heeft, gebruikt het programma de context om te bepalen welke betekenis bedoeld is en vervangt het door het passende antoniem.\n5. Uiteindelijk geeft het programma de aangepaste tekst uit met de vervangen woorden.\nKun je je vaardigheden in natuurlijke taalverwerking op de proef stellen en proberen de antoniemen te identificeren die in de gewijzigde tekst worden gebruikt?",
"target-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **source** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **target** is of type `text`, and description "Vertaal de prompt naar het Nederlands.".
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **target-suggestion** is of type `text`.
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
## Leaderboard
See the progress on the [Leaderboard](https://huggingface.co/spaces/DIBT-Dutch/prompt-translation).
## Help with translating
- A translation suggestion generated by GPT-4 is provided. You may edit it as much as you like, or delete it and translate entirely yourself.
- You are encouraged to adapt the wording so the prompt sounds more natural to you.
- Prompts that ask very specific questions about non-Dutch-speaking regions may be localized. For example: instead of American states you can use Dutch or Belgian provinces.
- Use your own judgment on whether and how to translate certain technical terms (for example, scientific or programming terms). Dutch scientific terms are often easiest to find by looking up the English Wikipedia page and clicking through to the Dutch version.
- Sometimes GPT-4 has also executed the prompt instead of just translating it. In that case you may simply delete the output.
## Discord
Questions or anything unclear? Come find us in the [Discord channel](https://discord.com/channels/879548962464493619/1217585262595408034)!
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
RanchiZhao/c4 | ---
license: apache-2.0
---
|
victorzarzu/interior-design-prompt-editing-dataset-train | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: input_prompt
dtype: string
- name: edit_prompt
dtype: string
- name: output_prompt
dtype: string
- name: designed_image
dtype: image
splits:
- name: train
num_bytes: 1464594066.747
num_examples: 4259
download_size: 1453222385
dataset_size: 1464594066.747
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
weijie210/UFB_prefs_iter_0_4_bestn | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: critique
dtype: 'null'
- name: post_score
dtype: float64
- name: pre_score
dtype: float64
- name: score_diff
dtype: float64
- name: subsitute
dtype: bool
splits:
- name: train_prefs
num_bytes: 181851409
num_examples: 53693
- name: test_prefs
num_bytes: 6011581
num_examples: 1786
download_size: 95477128
dataset_size: 187862990
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
kastan/rlhf-qa-conditional-generation-v2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 34403.31192660551
num_examples: 87
- name: valid
num_bytes: 8699.688073394496
num_examples: 22
download_size: 31360
dataset_size: 43103.0
---
# Dataset Card for "rlhf-qa-conditional-generation-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/lilac_netjuunosusume | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Lilac (Net-juu No Susume)
This is the dataset of Lilac (Net-juu No Susume), containing 172 images and their tags.
The core tags of this character are `pointy_ears, animal_ears, purple_hair, cat_ears, short_hair, purple_eyes, extra_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 172 | 93.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilac_netjuunosusume/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 172 | 93.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilac_netjuunosusume/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 316 | 158.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lilac_netjuunosusume/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lilac_netjuunosusume',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, fangs, open_mouth, smile, blush, bead_necklace, closed_eyes, solo |
| 1 | 8 |  |  |  |  |  | 1girl, dress, neck_bell, butterfly, jingle_bell, choker, open_mouth, brown_jacket, collar, looking_at_viewer, :d, bracelet, multiple_boys, solo_focus |
| 2 | 8 |  |  |  |  |  | 1girl, butterfly, choker, jingle_bell, neck_bell, solo, blush, anime_coloring, collarbone, open_mouth, jacket |
| 3 | 13 |  |  |  |  |  | 1girl, zettai_ryouiki, black_thighhighs, solo, beads, red_skirt, smile, cape, necklace, long_sleeves, shirt, boots, cloak |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | fangs | open_mouth | smile | blush | bead_necklace | closed_eyes | solo | dress | neck_bell | butterfly | jingle_bell | choker | brown_jacket | collar | looking_at_viewer | :d | bracelet | multiple_boys | solo_focus | anime_coloring | collarbone | jacket | zettai_ryouiki | black_thighhighs | beads | red_skirt | cape | necklace | long_sleeves | shirt | boots | cloak |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------------|:--------|:--------|:----------------|:--------------|:-------|:--------|:------------|:------------|:--------------|:---------|:---------------|:---------|:--------------------|:-----|:-----------|:----------------|:-------------|:-----------------|:-------------|:---------|:-----------------|:-------------------|:--------|:------------|:-------|:-----------|:---------------|:--------|:--------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | | X | | | X | | X | X | X | X | | | | | | | | X | X | X | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_togethercomputer__GPT-JT-Moderation-6B | ---
pretty_name: Evaluation run of togethercomputer/GPT-JT-Moderation-6B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/GPT-JT-Moderation-6B](https://huggingface.co/togethercomputer/GPT-JT-Moderation-6B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__GPT-JT-Moderation-6B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T22:16:11.352297](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__GPT-JT-Moderation-6B/blob/main/results_2023-10-15T22-16-11.352297.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004089765100671141,\n\
\ \"em_stderr\": 0.0006535802669912847,\n \"f1\": 0.041537332214765195,\n\
\ \"f1_stderr\": 0.0012446539419451222,\n \"acc\": 0.3182665708457473,\n\
\ \"acc_stderr\": 0.008157539670038592\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004089765100671141,\n \"em_stderr\": 0.0006535802669912847,\n\
\ \"f1\": 0.041537332214765195,\n \"f1_stderr\": 0.0012446539419451222\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \
\ \"acc_stderr\": 0.0027210765770416634\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6266771902131019,\n \"acc_stderr\": 0.013594002763035523\n\
\ }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/GPT-JT-Moderation-6B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T22_16_11.352297
path:
- '**/details_harness|drop|3_2023-10-15T22-16-11.352297.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T22-16-11.352297.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T22_16_11.352297
path:
- '**/details_harness|gsm8k|5_2023-10-15T22-16-11.352297.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T22-16-11.352297.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T22_16_11.352297
path:
- '**/details_harness|winogrande|5_2023-10-15T22-16-11.352297.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T22-16-11.352297.parquet'
- config_name: results
data_files:
- split: 2023_10_15T22_16_11.352297
path:
- results_2023-10-15T22-16-11.352297.parquet
- split: latest
path:
- results_2023-10-15T22-16-11.352297.parquet
---
# Dataset Card for Evaluation run of togethercomputer/GPT-JT-Moderation-6B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/GPT-JT-Moderation-6B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/GPT-JT-Moderation-6B](https://huggingface.co/togethercomputer/GPT-JT-Moderation-6B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__GPT-JT-Moderation-6B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T22:16:11.352297](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__GPT-JT-Moderation-6B/blob/main/results_2023-10-15T22-16-11.352297.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004089765100671141,
"em_stderr": 0.0006535802669912847,
"f1": 0.041537332214765195,
"f1_stderr": 0.0012446539419451222,
"acc": 0.3182665708457473,
"acc_stderr": 0.008157539670038592
},
"harness|drop|3": {
"em": 0.004089765100671141,
"em_stderr": 0.0006535802669912847,
"f1": 0.041537332214765195,
"f1_stderr": 0.0012446539419451222
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416634
},
"harness|winogrande|5": {
"acc": 0.6266771902131019,
"acc_stderr": 0.013594002763035523
}
}
```
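The aggregated numbers above can also be pulled straight from the `results` config (a minimal sketch, using the `latest` split declared in the YAML header):
```python
from datasets import load_dataset

# The "results" config stores aggregated metrics; "latest" tracks the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_togethercomputer__GPT-JT-Moderation-6B",
    "results",
    split="latest",
)
print(results[0])
```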
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hsultanbey/javascript | ---
dataset_info:
features:
- name: code
dtype: string
splits:
- name: train
num_bytes: 863518025
num_examples: 99999
download_size: 308377342
dataset_size: 863518025
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "javascript"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Brizape/SETH_split_0404 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: texts
dtype: string
splits:
- name: test
num_bytes: 582671
num_examples: 126
- name: train
num_bytes: 2425278
num_examples: 504
download_size: 837941
dataset_size: 3007949
---
# Dataset Card for "SETH_split_0404"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/CSIC_GPTNEO_FT | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: '768'
dtype: float32
- name: '769'
dtype: float32
- name: '770'
dtype: float32
- name: '771'
dtype: float32
- name: '772'
dtype: float32
- name: '773'
dtype: float32
- name: '774'
dtype: float32
- name: '775'
dtype: float32
- name: '776'
dtype: float32
- name: '777'
dtype: float32
- name: '778'
dtype: float32
- name: '779'
dtype: float32
- name: '780'
dtype: float32
- name: '781'
dtype: float32
- name: '782'
dtype: float32
- name: '783'
dtype: float32
- name: '784'
dtype: float32
- name: '785'
dtype: float32
- name: '786'
dtype: float32
- name: '787'
dtype: float32
- name: '788'
dtype: float32
- name: '789'
dtype: float32
- name: '790'
dtype: float32
- name: '791'
dtype: float32
- name: '792'
dtype: float32
- name: '793'
dtype: float32
- name: '794'
dtype: float32
- name: '795'
dtype: float32
- name: '796'
dtype: float32
- name: '797'
dtype: float32
- name: '798'
dtype: float32
- name: '799'
dtype: float32
- name: '800'
dtype: float32
- name: '801'
dtype: float32
- name: '802'
dtype: float32
- name: '803'
dtype: float32
- name: '804'
dtype: float32
- name: '805'
dtype: float32
- name: '806'
dtype: float32
- name: '807'
dtype: float32
- name: '808'
dtype: float32
- name: '809'
dtype: float32
- name: '810'
dtype: float32
- name: '811'
dtype: float32
- name: '812'
dtype: float32
- name: '813'
dtype: float32
- name: '814'
dtype: float32
- name: '815'
dtype: float32
- name: '816'
dtype: float32
- name: '817'
dtype: float32
- name: '818'
dtype: float32
- name: '819'
dtype: float32
- name: '820'
dtype: float32
- name: '821'
dtype: float32
- name: '822'
dtype: float32
- name: '823'
dtype: float32
- name: '824'
dtype: float32
- name: '825'
dtype: float32
- name: '826'
dtype: float32
- name: '827'
dtype: float32
- name: '828'
dtype: float32
- name: '829'
dtype: float32
- name: '830'
dtype: float32
- name: '831'
dtype: float32
- name: '832'
dtype: float32
- name: '833'
dtype: float32
- name: '834'
dtype: float32
- name: '835'
dtype: float32
- name: '836'
dtype: float32
- name: '837'
dtype: float32
- name: '838'
dtype: float32
- name: '839'
dtype: float32
- name: '840'
dtype: float32
- name: '841'
dtype: float32
- name: '842'
dtype: float32
- name: '843'
dtype: float32
- name: '844'
dtype: float32
- name: '845'
dtype: float32
- name: '846'
dtype: float32
- name: '847'
dtype: float32
- name: '848'
dtype: float32
- name: '849'
dtype: float32
- name: '850'
dtype: float32
- name: '851'
dtype: float32
- name: '852'
dtype: float32
- name: '853'
dtype: float32
- name: '854'
dtype: float32
- name: '855'
dtype: float32
- name: '856'
dtype: float32
- name: '857'
dtype: float32
- name: '858'
dtype: float32
- name: '859'
dtype: float32
- name: '860'
dtype: float32
- name: '861'
dtype: float32
- name: '862'
dtype: float32
- name: '863'
dtype: float32
- name: '864'
dtype: float32
- name: '865'
dtype: float32
- name: '866'
dtype: float32
- name: '867'
dtype: float32
- name: '868'
dtype: float32
- name: '869'
dtype: float32
- name: '870'
dtype: float32
- name: '871'
dtype: float32
- name: '872'
dtype: float32
- name: '873'
dtype: float32
- name: '874'
dtype: float32
- name: '875'
dtype: float32
- name: '876'
dtype: float32
- name: '877'
dtype: float32
- name: '878'
dtype: float32
- name: '879'
dtype: float32
- name: '880'
dtype: float32
- name: '881'
dtype: float32
- name: '882'
dtype: float32
- name: '883'
dtype: float32
- name: '884'
dtype: float32
- name: '885'
dtype: float32
- name: '886'
dtype: float32
- name: '887'
dtype: float32
- name: '888'
dtype: float32
- name: '889'
dtype: float32
- name: '890'
dtype: float32
- name: '891'
dtype: float32
- name: '892'
dtype: float32
- name: '893'
dtype: float32
- name: '894'
dtype: float32
- name: '895'
dtype: float32
- name: '896'
dtype: float32
- name: '897'
dtype: float32
- name: '898'
dtype: float32
- name: '899'
dtype: float32
- name: '900'
dtype: float32
- name: '901'
dtype: float32
- name: '902'
dtype: float32
- name: '903'
dtype: float32
- name: '904'
dtype: float32
- name: '905'
dtype: float32
- name: '906'
dtype: float32
- name: '907'
dtype: float32
- name: '908'
dtype: float32
- name: '909'
dtype: float32
- name: '910'
dtype: float32
- name: '911'
dtype: float32
- name: '912'
dtype: float32
- name: '913'
dtype: float32
- name: '914'
dtype: float32
- name: '915'
dtype: float32
- name: '916'
dtype: float32
- name: '917'
dtype: float32
- name: '918'
dtype: float32
- name: '919'
dtype: float32
- name: '920'
dtype: float32
- name: '921'
dtype: float32
- name: '922'
dtype: float32
- name: '923'
dtype: float32
- name: '924'
dtype: float32
- name: '925'
dtype: float32
- name: '926'
dtype: float32
- name: '927'
dtype: float32
- name: '928'
dtype: float32
- name: '929'
dtype: float32
- name: '930'
dtype: float32
- name: '931'
dtype: float32
- name: '932'
dtype: float32
- name: '933'
dtype: float32
- name: '934'
dtype: float32
- name: '935'
dtype: float32
- name: '936'
dtype: float32
- name: '937'
dtype: float32
- name: '938'
dtype: float32
- name: '939'
dtype: float32
- name: '940'
dtype: float32
- name: '941'
dtype: float32
- name: '942'
dtype: float32
- name: '943'
dtype: float32
- name: '944'
dtype: float32
- name: '945'
dtype: float32
- name: '946'
dtype: float32
- name: '947'
dtype: float32
- name: '948'
dtype: float32
- name: '949'
dtype: float32
- name: '950'
dtype: float32
- name: '951'
dtype: float32
- name: '952'
dtype: float32
- name: '953'
dtype: float32
- name: '954'
dtype: float32
- name: '955'
dtype: float32
- name: '956'
dtype: float32
- name: '957'
dtype: float32
- name: '958'
dtype: float32
- name: '959'
dtype: float32
- name: '960'
dtype: float32
- name: '961'
dtype: float32
- name: '962'
dtype: float32
- name: '963'
dtype: float32
- name: '964'
dtype: float32
- name: '965'
dtype: float32
- name: '966'
dtype: float32
- name: '967'
dtype: float32
- name: '968'
dtype: float32
- name: '969'
dtype: float32
- name: '970'
dtype: float32
- name: '971'
dtype: float32
- name: '972'
dtype: float32
- name: '973'
dtype: float32
- name: '974'
dtype: float32
- name: '975'
dtype: float32
- name: '976'
dtype: float32
- name: '977'
dtype: float32
- name: '978'
dtype: float32
- name: '979'
dtype: float32
- name: '980'
dtype: float32
- name: '981'
dtype: float32
- name: '982'
dtype: float32
- name: '983'
dtype: float32
- name: '984'
dtype: float32
- name: '985'
dtype: float32
- name: '986'
dtype: float32
- name: '987'
dtype: float32
- name: '988'
dtype: float32
- name: '989'
dtype: float32
- name: '990'
dtype: float32
- name: '991'
dtype: float32
- name: '992'
dtype: float32
- name: '993'
dtype: float32
- name: '994'
dtype: float32
- name: '995'
dtype: float32
- name: '996'
dtype: float32
- name: '997'
dtype: float32
- name: '998'
dtype: float32
- name: '999'
dtype: float32
- name: '1000'
dtype: float32
- name: '1001'
dtype: float32
- name: '1002'
dtype: float32
- name: '1003'
dtype: float32
- name: '1004'
dtype: float32
- name: '1005'
dtype: float32
- name: '1006'
dtype: float32
- name: '1007'
dtype: float32
- name: '1008'
dtype: float32
- name: '1009'
dtype: float32
- name: '1010'
dtype: float32
- name: '1011'
dtype: float32
- name: '1012'
dtype: float32
- name: '1013'
dtype: float32
- name: '1014'
dtype: float32
- name: '1015'
dtype: float32
- name: '1016'
dtype: float32
- name: '1017'
dtype: float32
- name: '1018'
dtype: float32
- name: '1019'
dtype: float32
- name: '1020'
dtype: float32
- name: '1021'
dtype: float32
- name: '1022'
dtype: float32
- name: '1023'
dtype: float32
- name: '1024'
dtype: float32
- name: '1025'
dtype: float32
- name: '1026'
dtype: float32
- name: '1027'
dtype: float32
- name: '1028'
dtype: float32
- name: '1029'
dtype: float32
- name: '1030'
dtype: float32
- name: '1031'
dtype: float32
- name: '1032'
dtype: float32
- name: '1033'
dtype: float32
- name: '1034'
dtype: float32
- name: '1035'
dtype: float32
- name: '1036'
dtype: float32
- name: '1037'
dtype: float32
- name: '1038'
dtype: float32
- name: '1039'
dtype: float32
- name: '1040'
dtype: float32
- name: '1041'
dtype: float32
- name: '1042'
dtype: float32
- name: '1043'
dtype: float32
- name: '1044'
dtype: float32
- name: '1045'
dtype: float32
- name: '1046'
dtype: float32
- name: '1047'
dtype: float32
- name: '1048'
dtype: float32
- name: '1049'
dtype: float32
- name: '1050'
dtype: float32
- name: '1051'
dtype: float32
- name: '1052'
dtype: float32
- name: '1053'
dtype: float32
- name: '1054'
dtype: float32
- name: '1055'
dtype: float32
- name: '1056'
dtype: float32
- name: '1057'
dtype: float32
- name: '1058'
dtype: float32
- name: '1059'
dtype: float32
- name: '1060'
dtype: float32
- name: '1061'
dtype: float32
- name: '1062'
dtype: float32
- name: '1063'
dtype: float32
- name: '1064'
dtype: float32
- name: '1065'
dtype: float32
- name: '1066'
dtype: float32
- name: '1067'
dtype: float32
- name: '1068'
dtype: float32
- name: '1069'
dtype: float32
- name: '1070'
dtype: float32
- name: '1071'
dtype: float32
- name: '1072'
dtype: float32
- name: '1073'
dtype: float32
- name: '1074'
dtype: float32
- name: '1075'
dtype: float32
- name: '1076'
dtype: float32
- name: '1077'
dtype: float32
- name: '1078'
dtype: float32
- name: '1079'
dtype: float32
- name: '1080'
dtype: float32
- name: '1081'
dtype: float32
- name: '1082'
dtype: float32
- name: '1083'
dtype: float32
- name: '1084'
dtype: float32
- name: '1085'
dtype: float32
- name: '1086'
dtype: float32
- name: '1087'
dtype: float32
- name: '1088'
dtype: float32
- name: '1089'
dtype: float32
- name: '1090'
dtype: float32
- name: '1091'
dtype: float32
- name: '1092'
dtype: float32
- name: '1093'
dtype: float32
- name: '1094'
dtype: float32
- name: '1095'
dtype: float32
- name: '1096'
dtype: float32
- name: '1097'
dtype: float32
- name: '1098'
dtype: float32
- name: '1099'
dtype: float32
- name: '1100'
dtype: float32
- name: '1101'
dtype: float32
- name: '1102'
dtype: float32
- name: '1103'
dtype: float32
- name: '1104'
dtype: float32
- name: '1105'
dtype: float32
- name: '1106'
dtype: float32
- name: '1107'
dtype: float32
- name: '1108'
dtype: float32
- name: '1109'
dtype: float32
- name: '1110'
dtype: float32
- name: '1111'
dtype: float32
- name: '1112'
dtype: float32
- name: '1113'
dtype: float32
- name: '1114'
dtype: float32
- name: '1115'
dtype: float32
- name: '1116'
dtype: float32
- name: '1117'
dtype: float32
- name: '1118'
dtype: float32
- name: '1119'
dtype: float32
- name: '1120'
dtype: float32
- name: '1121'
dtype: float32
- name: '1122'
dtype: float32
- name: '1123'
dtype: float32
- name: '1124'
dtype: float32
- name: '1125'
dtype: float32
- name: '1126'
dtype: float32
- name: '1127'
dtype: float32
- name: '1128'
dtype: float32
- name: '1129'
dtype: float32
- name: '1130'
dtype: float32
- name: '1131'
dtype: float32
- name: '1132'
dtype: float32
- name: '1133'
dtype: float32
- name: '1134'
dtype: float32
- name: '1135'
dtype: float32
- name: '1136'
dtype: float32
- name: '1137'
dtype: float32
- name: '1138'
dtype: float32
- name: '1139'
dtype: float32
- name: '1140'
dtype: float32
- name: '1141'
dtype: float32
- name: '1142'
dtype: float32
- name: '1143'
dtype: float32
- name: '1144'
dtype: float32
- name: '1145'
dtype: float32
- name: '1146'
dtype: float32
- name: '1147'
dtype: float32
- name: '1148'
dtype: float32
- name: '1149'
dtype: float32
- name: '1150'
dtype: float32
- name: '1151'
dtype: float32
- name: '1152'
dtype: float32
- name: '1153'
dtype: float32
- name: '1154'
dtype: float32
- name: '1155'
dtype: float32
- name: '1156'
dtype: float32
- name: '1157'
dtype: float32
- name: '1158'
dtype: float32
- name: '1159'
dtype: float32
- name: '1160'
dtype: float32
- name: '1161'
dtype: float32
- name: '1162'
dtype: float32
- name: '1163'
dtype: float32
- name: '1164'
dtype: float32
- name: '1165'
dtype: float32
- name: '1166'
dtype: float32
- name: '1167'
dtype: float32
- name: '1168'
dtype: float32
- name: '1169'
dtype: float32
- name: '1170'
dtype: float32
- name: '1171'
dtype: float32
- name: '1172'
dtype: float32
- name: '1173'
dtype: float32
- name: '1174'
dtype: float32
- name: '1175'
dtype: float32
- name: '1176'
dtype: float32
- name: '1177'
dtype: float32
- name: '1178'
dtype: float32
- name: '1179'
dtype: float32
- name: '1180'
dtype: float32
- name: '1181'
dtype: float32
- name: '1182'
dtype: float32
- name: '1183'
dtype: float32
- name: '1184'
dtype: float32
- name: '1185'
dtype: float32
- name: '1186'
dtype: float32
- name: '1187'
dtype: float32
- name: '1188'
dtype: float32
- name: '1189'
dtype: float32
- name: '1190'
dtype: float32
- name: '1191'
dtype: float32
- name: '1192'
dtype: float32
- name: '1193'
dtype: float32
- name: '1194'
dtype: float32
- name: '1195'
dtype: float32
- name: '1196'
dtype: float32
- name: '1197'
dtype: float32
- name: '1198'
dtype: float32
- name: '1199'
dtype: float32
- name: '1200'
dtype: float32
- name: '1201'
dtype: float32
- name: '1202'
dtype: float32
- name: '1203'
dtype: float32
- name: '1204'
dtype: float32
- name: '1205'
dtype: float32
- name: '1206'
dtype: float32
- name: '1207'
dtype: float32
- name: '1208'
dtype: float32
- name: '1209'
dtype: float32
- name: '1210'
dtype: float32
- name: '1211'
dtype: float32
- name: '1212'
dtype: float32
- name: '1213'
dtype: float32
- name: '1214'
dtype: float32
- name: '1215'
dtype: float32
- name: '1216'
dtype: float32
- name: '1217'
dtype: float32
- name: '1218'
dtype: float32
- name: '1219'
dtype: float32
- name: '1220'
dtype: float32
- name: '1221'
dtype: float32
- name: '1222'
dtype: float32
- name: '1223'
dtype: float32
- name: '1224'
dtype: float32
- name: '1225'
dtype: float32
- name: '1226'
dtype: float32
- name: '1227'
dtype: float32
- name: '1228'
dtype: float32
- name: '1229'
dtype: float32
- name: '1230'
dtype: float32
- name: '1231'
dtype: float32
- name: '1232'
dtype: float32
- name: '1233'
dtype: float32
- name: '1234'
dtype: float32
- name: '1235'
dtype: float32
- name: '1236'
dtype: float32
- name: '1237'
dtype: float32
- name: '1238'
dtype: float32
- name: '1239'
dtype: float32
- name: '1240'
dtype: float32
- name: '1241'
dtype: float32
- name: '1242'
dtype: float32
- name: '1243'
dtype: float32
- name: '1244'
dtype: float32
- name: '1245'
dtype: float32
- name: '1246'
dtype: float32
- name: '1247'
dtype: float32
- name: '1248'
dtype: float32
- name: '1249'
dtype: float32
- name: '1250'
dtype: float32
- name: '1251'
dtype: float32
- name: '1252'
dtype: float32
- name: '1253'
dtype: float32
- name: '1254'
dtype: float32
- name: '1255'
dtype: float32
- name: '1256'
dtype: float32
- name: '1257'
dtype: float32
- name: '1258'
dtype: float32
- name: '1259'
dtype: float32
- name: '1260'
dtype: float32
- name: '1261'
dtype: float32
- name: '1262'
dtype: float32
- name: '1263'
dtype: float32
- name: '1264'
dtype: float32
- name: '1265'
dtype: float32
- name: '1266'
dtype: float32
- name: '1267'
dtype: float32
- name: '1268'
dtype: float32
- name: '1269'
dtype: float32
- name: '1270'
dtype: float32
- name: '1271'
dtype: float32
- name: '1272'
dtype: float32
- name: '1273'
dtype: float32
- name: '1274'
dtype: float32
- name: '1275'
dtype: float32
- name: '1276'
dtype: float32
- name: '1277'
dtype: float32
- name: '1278'
dtype: float32
- name: '1279'
dtype: float32
- name: '1280'
dtype: float32
- name: '1281'
dtype: float32
- name: '1282'
dtype: float32
- name: '1283'
dtype: float32
- name: '1284'
dtype: float32
- name: '1285'
dtype: float32
- name: '1286'
dtype: float32
- name: '1287'
dtype: float32
- name: '1288'
dtype: float32
- name: '1289'
dtype: float32
- name: '1290'
dtype: float32
- name: '1291'
dtype: float32
- name: '1292'
dtype: float32
- name: '1293'
dtype: float32
- name: '1294'
dtype: float32
- name: '1295'
dtype: float32
- name: '1296'
dtype: float32
- name: '1297'
dtype: float32
- name: '1298'
dtype: float32
- name: '1299'
dtype: float32
- name: '1300'
dtype: float32
- name: '1301'
dtype: float32
- name: '1302'
dtype: float32
- name: '1303'
dtype: float32
- name: '1304'
dtype: float32
- name: '1305'
dtype: float32
- name: '1306'
dtype: float32
- name: '1307'
dtype: float32
- name: '1308'
dtype: float32
- name: '1309'
dtype: float32
- name: '1310'
dtype: float32
- name: '1311'
dtype: float32
- name: '1312'
dtype: float32
- name: '1313'
dtype: float32
- name: '1314'
dtype: float32
- name: '1315'
dtype: float32
- name: '1316'
dtype: float32
- name: '1317'
dtype: float32
- name: '1318'
dtype: float32
- name: '1319'
dtype: float32
- name: '1320'
dtype: float32
- name: '1321'
dtype: float32
- name: '1322'
dtype: float32
- name: '1323'
dtype: float32
- name: '1324'
dtype: float32
- name: '1325'
dtype: float32
- name: '1326'
dtype: float32
- name: '1327'
dtype: float32
- name: '1328'
dtype: float32
- name: '1329'
dtype: float32
- name: '1330'
dtype: float32
- name: '1331'
dtype: float32
- name: '1332'
dtype: float32
- name: '1333'
dtype: float32
- name: '1334'
dtype: float32
- name: '1335'
dtype: float32
- name: '1336'
dtype: float32
- name: '1337'
dtype: float32
- name: '1338'
dtype: float32
- name: '1339'
dtype: float32
- name: '1340'
dtype: float32
- name: '1341'
dtype: float32
- name: '1342'
dtype: float32
- name: '1343'
dtype: float32
- name: '1344'
dtype: float32
- name: '1345'
dtype: float32
- name: '1346'
dtype: float32
- name: '1347'
dtype: float32
- name: '1348'
dtype: float32
- name: '1349'
dtype: float32
- name: '1350'
dtype: float32
- name: '1351'
dtype: float32
- name: '1352'
dtype: float32
- name: '1353'
dtype: float32
- name: '1354'
dtype: float32
- name: '1355'
dtype: float32
- name: '1356'
dtype: float32
- name: '1357'
dtype: float32
- name: '1358'
dtype: float32
- name: '1359'
dtype: float32
- name: '1360'
dtype: float32
- name: '1361'
dtype: float32
- name: '1362'
dtype: float32
- name: '1363'
dtype: float32
- name: '1364'
dtype: float32
- name: '1365'
dtype: float32
- name: '1366'
dtype: float32
- name: '1367'
dtype: float32
- name: '1368'
dtype: float32
- name: '1369'
dtype: float32
- name: '1370'
dtype: float32
- name: '1371'
dtype: float32
- name: '1372'
dtype: float32
- name: '1373'
dtype: float32
- name: '1374'
dtype: float32
- name: '1375'
dtype: float32
- name: '1376'
dtype: float32
- name: '1377'
dtype: float32
- name: '1378'
dtype: float32
- name: '1379'
dtype: float32
- name: '1380'
dtype: float32
- name: '1381'
dtype: float32
- name: '1382'
dtype: float32
- name: '1383'
dtype: float32
- name: '1384'
dtype: float32
- name: '1385'
dtype: float32
- name: '1386'
dtype: float32
- name: '1387'
dtype: float32
- name: '1388'
dtype: float32
- name: '1389'
dtype: float32
- name: '1390'
dtype: float32
- name: '1391'
dtype: float32
- name: '1392'
dtype: float32
- name: '1393'
dtype: float32
- name: '1394'
dtype: float32
- name: '1395'
dtype: float32
- name: '1396'
dtype: float32
- name: '1397'
dtype: float32
- name: '1398'
dtype: float32
- name: '1399'
dtype: float32
- name: '1400'
dtype: float32
- name: '1401'
dtype: float32
- name: '1402'
dtype: float32
- name: '1403'
dtype: float32
- name: '1404'
dtype: float32
- name: '1405'
dtype: float32
- name: '1406'
dtype: float32
- name: '1407'
dtype: float32
- name: '1408'
dtype: float32
- name: '1409'
dtype: float32
- name: '1410'
dtype: float32
- name: '1411'
dtype: float32
- name: '1412'
dtype: float32
- name: '1413'
dtype: float32
- name: '1414'
dtype: float32
- name: '1415'
dtype: float32
- name: '1416'
dtype: float32
- name: '1417'
dtype: float32
- name: '1418'
dtype: float32
- name: '1419'
dtype: float32
- name: '1420'
dtype: float32
- name: '1421'
dtype: float32
- name: '1422'
dtype: float32
- name: '1423'
dtype: float32
- name: '1424'
dtype: float32
- name: '1425'
dtype: float32
- name: '1426'
dtype: float32
- name: '1427'
dtype: float32
- name: '1428'
dtype: float32
- name: '1429'
dtype: float32
- name: '1430'
dtype: float32
- name: '1431'
dtype: float32
- name: '1432'
dtype: float32
- name: '1433'
dtype: float32
- name: '1434'
dtype: float32
- name: '1435'
dtype: float32
- name: '1436'
dtype: float32
- name: '1437'
dtype: float32
- name: '1438'
dtype: float32
- name: '1439'
dtype: float32
- name: '1440'
dtype: float32
- name: '1441'
dtype: float32
- name: '1442'
dtype: float32
- name: '1443'
dtype: float32
- name: '1444'
dtype: float32
- name: '1445'
dtype: float32
- name: '1446'
dtype: float32
- name: '1447'
dtype: float32
- name: '1448'
dtype: float32
- name: '1449'
dtype: float32
- name: '1450'
dtype: float32
- name: '1451'
dtype: float32
- name: '1452'
dtype: float32
- name: '1453'
dtype: float32
- name: '1454'
dtype: float32
- name: '1455'
dtype: float32
- name: '1456'
dtype: float32
- name: '1457'
dtype: float32
- name: '1458'
dtype: float32
- name: '1459'
dtype: float32
- name: '1460'
dtype: float32
- name: '1461'
dtype: float32
- name: '1462'
dtype: float32
- name: '1463'
dtype: float32
- name: '1464'
dtype: float32
- name: '1465'
dtype: float32
- name: '1466'
dtype: float32
- name: '1467'
dtype: float32
- name: '1468'
dtype: float32
- name: '1469'
dtype: float32
- name: '1470'
dtype: float32
- name: '1471'
dtype: float32
- name: '1472'
dtype: float32
- name: '1473'
dtype: float32
- name: '1474'
dtype: float32
- name: '1475'
dtype: float32
- name: '1476'
dtype: float32
- name: '1477'
dtype: float32
- name: '1478'
dtype: float32
- name: '1479'
dtype: float32
- name: '1480'
dtype: float32
- name: '1481'
dtype: float32
- name: '1482'
dtype: float32
- name: '1483'
dtype: float32
- name: '1484'
dtype: float32
- name: '1485'
dtype: float32
- name: '1486'
dtype: float32
- name: '1487'
dtype: float32
- name: '1488'
dtype: float32
- name: '1489'
dtype: float32
- name: '1490'
dtype: float32
- name: '1491'
dtype: float32
- name: '1492'
dtype: float32
- name: '1493'
dtype: float32
- name: '1494'
dtype: float32
- name: '1495'
dtype: float32
- name: '1496'
dtype: float32
- name: '1497'
dtype: float32
- name: '1498'
dtype: float32
- name: '1499'
dtype: float32
- name: '1500'
dtype: float32
- name: '1501'
dtype: float32
- name: '1502'
dtype: float32
- name: '1503'
dtype: float32
- name: '1504'
dtype: float32
- name: '1505'
dtype: float32
- name: '1506'
dtype: float32
- name: '1507'
dtype: float32
- name: '1508'
dtype: float32
- name: '1509'
dtype: float32
- name: '1510'
dtype: float32
- name: '1511'
dtype: float32
- name: '1512'
dtype: float32
- name: '1513'
dtype: float32
- name: '1514'
dtype: float32
- name: '1515'
dtype: float32
- name: '1516'
dtype: float32
- name: '1517'
dtype: float32
- name: '1518'
dtype: float32
- name: '1519'
dtype: float32
- name: '1520'
dtype: float32
- name: '1521'
dtype: float32
- name: '1522'
dtype: float32
- name: '1523'
dtype: float32
- name: '1524'
dtype: float32
- name: '1525'
dtype: float32
- name: '1526'
dtype: float32
- name: '1527'
dtype: float32
- name: '1528'
dtype: float32
- name: '1529'
dtype: float32
- name: '1530'
dtype: float32
- name: '1531'
dtype: float32
- name: '1532'
dtype: float32
- name: '1533'
dtype: float32
- name: '1534'
dtype: float32
- name: '1535'
dtype: float32
- name: '1536'
dtype: float32
- name: '1537'
dtype: float32
- name: '1538'
dtype: float32
- name: '1539'
dtype: float32
- name: '1540'
dtype: float32
- name: '1541'
dtype: float32
- name: '1542'
dtype: float32
- name: '1543'
dtype: float32
- name: '1544'
dtype: float32
- name: '1545'
dtype: float32
- name: '1546'
dtype: float32
- name: '1547'
dtype: float32
- name: '1548'
dtype: float32
- name: '1549'
dtype: float32
- name: '1550'
dtype: float32
- name: '1551'
dtype: float32
- name: '1552'
dtype: float32
- name: '1553'
dtype: float32
- name: '1554'
dtype: float32
- name: '1555'
dtype: float32
- name: '1556'
dtype: float32
- name: '1557'
dtype: float32
- name: '1558'
dtype: float32
- name: '1559'
dtype: float32
- name: '1560'
dtype: float32
- name: '1561'
dtype: float32
- name: '1562'
dtype: float32
- name: '1563'
dtype: float32
- name: '1564'
dtype: float32
- name: '1565'
dtype: float32
- name: '1566'
dtype: float32
- name: '1567'
dtype: float32
- name: '1568'
dtype: float32
- name: '1569'
dtype: float32
- name: '1570'
dtype: float32
- name: '1571'
dtype: float32
- name: '1572'
dtype: float32
- name: '1573'
dtype: float32
- name: '1574'
dtype: float32
- name: '1575'
dtype: float32
- name: '1576'
dtype: float32
- name: '1577'
dtype: float32
- name: '1578'
dtype: float32
- name: '1579'
dtype: float32
- name: '1580'
dtype: float32
- name: '1581'
dtype: float32
- name: '1582'
dtype: float32
- name: '1583'
dtype: float32
- name: '1584'
dtype: float32
- name: '1585'
dtype: float32
- name: '1586'
dtype: float32
- name: '1587'
dtype: float32
- name: '1588'
dtype: float32
- name: '1589'
dtype: float32
- name: '1590'
dtype: float32
- name: '1591'
dtype: float32
- name: '1592'
dtype: float32
- name: '1593'
dtype: float32
- name: '1594'
dtype: float32
- name: '1595'
dtype: float32
- name: '1596'
dtype: float32
- name: '1597'
dtype: float32
- name: '1598'
dtype: float32
- name: '1599'
dtype: float32
- name: '1600'
dtype: float32
- name: '1601'
dtype: float32
- name: '1602'
dtype: float32
- name: '1603'
dtype: float32
- name: '1604'
dtype: float32
- name: '1605'
dtype: float32
- name: '1606'
dtype: float32
- name: '1607'
dtype: float32
- name: '1608'
dtype: float32
- name: '1609'
dtype: float32
- name: '1610'
dtype: float32
- name: '1611'
dtype: float32
- name: '1612'
dtype: float32
- name: '1613'
dtype: float32
- name: '1614'
dtype: float32
- name: '1615'
dtype: float32
- name: '1616'
dtype: float32
- name: '1617'
dtype: float32
- name: '1618'
dtype: float32
- name: '1619'
dtype: float32
- name: '1620'
dtype: float32
- name: '1621'
dtype: float32
- name: '1622'
dtype: float32
- name: '1623'
dtype: float32
- name: '1624'
dtype: float32
- name: '1625'
dtype: float32
- name: '1626'
dtype: float32
- name: '1627'
dtype: float32
- name: '1628'
dtype: float32
- name: '1629'
dtype: float32
- name: '1630'
dtype: float32
- name: '1631'
dtype: float32
- name: '1632'
dtype: float32
- name: '1633'
dtype: float32
- name: '1634'
dtype: float32
- name: '1635'
dtype: float32
- name: '1636'
dtype: float32
- name: '1637'
dtype: float32
- name: '1638'
dtype: float32
- name: '1639'
dtype: float32
- name: '1640'
dtype: float32
- name: '1641'
dtype: float32
- name: '1642'
dtype: float32
- name: '1643'
dtype: float32
- name: '1644'
dtype: float32
- name: '1645'
dtype: float32
- name: '1646'
dtype: float32
- name: '1647'
dtype: float32
- name: '1648'
dtype: float32
- name: '1649'
dtype: float32
- name: '1650'
dtype: float32
- name: '1651'
dtype: float32
- name: '1652'
dtype: float32
- name: '1653'
dtype: float32
- name: '1654'
dtype: float32
- name: '1655'
dtype: float32
- name: '1656'
dtype: float32
- name: '1657'
dtype: float32
- name: '1658'
dtype: float32
- name: '1659'
dtype: float32
- name: '1660'
dtype: float32
- name: '1661'
dtype: float32
- name: '1662'
dtype: float32
- name: '1663'
dtype: float32
- name: '1664'
dtype: float32
- name: '1665'
dtype: float32
- name: '1666'
dtype: float32
- name: '1667'
dtype: float32
- name: '1668'
dtype: float32
- name: '1669'
dtype: float32
- name: '1670'
dtype: float32
- name: '1671'
dtype: float32
- name: '1672'
dtype: float32
- name: '1673'
dtype: float32
- name: '1674'
dtype: float32
- name: '1675'
dtype: float32
- name: '1676'
dtype: float32
- name: '1677'
dtype: float32
- name: '1678'
dtype: float32
- name: '1679'
dtype: float32
- name: '1680'
dtype: float32
- name: '1681'
dtype: float32
- name: '1682'
dtype: float32
- name: '1683'
dtype: float32
- name: '1684'
dtype: float32
- name: '1685'
dtype: float32
- name: '1686'
dtype: float32
- name: '1687'
dtype: float32
- name: '1688'
dtype: float32
- name: '1689'
dtype: float32
- name: '1690'
dtype: float32
- name: '1691'
dtype: float32
- name: '1692'
dtype: float32
- name: '1693'
dtype: float32
- name: '1694'
dtype: float32
- name: '1695'
dtype: float32
- name: '1696'
dtype: float32
- name: '1697'
dtype: float32
- name: '1698'
dtype: float32
- name: '1699'
dtype: float32
- name: '1700'
dtype: float32
- name: '1701'
dtype: float32
- name: '1702'
dtype: float32
- name: '1703'
dtype: float32
- name: '1704'
dtype: float32
- name: '1705'
dtype: float32
- name: '1706'
dtype: float32
- name: '1707'
dtype: float32
- name: '1708'
dtype: float32
- name: '1709'
dtype: float32
- name: '1710'
dtype: float32
- name: '1711'
dtype: float32
- name: '1712'
dtype: float32
- name: '1713'
dtype: float32
- name: '1714'
dtype: float32
- name: '1715'
dtype: float32
- name: '1716'
dtype: float32
- name: '1717'
dtype: float32
- name: '1718'
dtype: float32
- name: '1719'
dtype: float32
- name: '1720'
dtype: float32
- name: '1721'
dtype: float32
- name: '1722'
dtype: float32
- name: '1723'
dtype: float32
- name: '1724'
dtype: float32
- name: '1725'
dtype: float32
- name: '1726'
dtype: float32
- name: '1727'
dtype: float32
- name: '1728'
dtype: float32
- name: '1729'
dtype: float32
- name: '1730'
dtype: float32
- name: '1731'
dtype: float32
- name: '1732'
dtype: float32
- name: '1733'
dtype: float32
- name: '1734'
dtype: float32
- name: '1735'
dtype: float32
- name: '1736'
dtype: float32
- name: '1737'
dtype: float32
- name: '1738'
dtype: float32
- name: '1739'
dtype: float32
- name: '1740'
dtype: float32
- name: '1741'
dtype: float32
- name: '1742'
dtype: float32
- name: '1743'
dtype: float32
- name: '1744'
dtype: float32
- name: '1745'
dtype: float32
- name: '1746'
dtype: float32
- name: '1747'
dtype: float32
- name: '1748'
dtype: float32
- name: '1749'
dtype: float32
- name: '1750'
dtype: float32
- name: '1751'
dtype: float32
- name: '1752'
dtype: float32
- name: '1753'
dtype: float32
- name: '1754'
dtype: float32
- name: '1755'
dtype: float32
- name: '1756'
dtype: float32
- name: '1757'
dtype: float32
- name: '1758'
dtype: float32
- name: '1759'
dtype: float32
- name: '1760'
dtype: float32
- name: '1761'
dtype: float32
- name: '1762'
dtype: float32
- name: '1763'
dtype: float32
- name: '1764'
dtype: float32
- name: '1765'
dtype: float32
- name: '1766'
dtype: float32
- name: '1767'
dtype: float32
- name: '1768'
dtype: float32
- name: '1769'
dtype: float32
- name: '1770'
dtype: float32
- name: '1771'
dtype: float32
- name: '1772'
dtype: float32
- name: '1773'
dtype: float32
- name: '1774'
dtype: float32
- name: '1775'
dtype: float32
- name: '1776'
dtype: float32
- name: '1777'
dtype: float32
- name: '1778'
dtype: float32
- name: '1779'
dtype: float32
- name: '1780'
dtype: float32
- name: '1781'
dtype: float32
- name: '1782'
dtype: float32
- name: '1783'
dtype: float32
- name: '1784'
dtype: float32
- name: '1785'
dtype: float32
- name: '1786'
dtype: float32
- name: '1787'
dtype: float32
- name: '1788'
dtype: float32
- name: '1789'
dtype: float32
- name: '1790'
dtype: float32
- name: '1791'
dtype: float32
- name: '1792'
dtype: float32
- name: '1793'
dtype: float32
- name: '1794'
dtype: float32
- name: '1795'
dtype: float32
- name: '1796'
dtype: float32
- name: '1797'
dtype: float32
- name: '1798'
dtype: float32
- name: '1799'
dtype: float32
- name: '1800'
dtype: float32
- name: '1801'
dtype: float32
- name: '1802'
dtype: float32
- name: '1803'
dtype: float32
- name: '1804'
dtype: float32
- name: '1805'
dtype: float32
- name: '1806'
dtype: float32
- name: '1807'
dtype: float32
- name: '1808'
dtype: float32
- name: '1809'
dtype: float32
- name: '1810'
dtype: float32
- name: '1811'
dtype: float32
- name: '1812'
dtype: float32
- name: '1813'
dtype: float32
- name: '1814'
dtype: float32
- name: '1815'
dtype: float32
- name: '1816'
dtype: float32
- name: '1817'
dtype: float32
- name: '1818'
dtype: float32
- name: '1819'
dtype: float32
- name: '1820'
dtype: float32
- name: '1821'
dtype: float32
- name: '1822'
dtype: float32
- name: '1823'
dtype: float32
- name: '1824'
dtype: float32
- name: '1825'
dtype: float32
- name: '1826'
dtype: float32
- name: '1827'
dtype: float32
- name: '1828'
dtype: float32
- name: '1829'
dtype: float32
- name: '1830'
dtype: float32
- name: '1831'
dtype: float32
- name: '1832'
dtype: float32
- name: '1833'
dtype: float32
- name: '1834'
dtype: float32
- name: '1835'
dtype: float32
- name: '1836'
dtype: float32
- name: '1837'
dtype: float32
- name: '1838'
dtype: float32
- name: '1839'
dtype: float32
- name: '1840'
dtype: float32
- name: '1841'
dtype: float32
- name: '1842'
dtype: float32
- name: '1843'
dtype: float32
- name: '1844'
dtype: float32
- name: '1845'
dtype: float32
- name: '1846'
dtype: float32
- name: '1847'
dtype: float32
- name: '1848'
dtype: float32
- name: '1849'
dtype: float32
- name: '1850'
dtype: float32
- name: '1851'
dtype: float32
- name: '1852'
dtype: float32
- name: '1853'
dtype: float32
- name: '1854'
dtype: float32
- name: '1855'
dtype: float32
- name: '1856'
dtype: float32
- name: '1857'
dtype: float32
- name: '1858'
dtype: float32
- name: '1859'
dtype: float32
- name: '1860'
dtype: float32
- name: '1861'
dtype: float32
- name: '1862'
dtype: float32
- name: '1863'
dtype: float32
- name: '1864'
dtype: float32
- name: '1865'
dtype: float32
- name: '1866'
dtype: float32
- name: '1867'
dtype: float32
- name: '1868'
dtype: float32
- name: '1869'
dtype: float32
- name: '1870'
dtype: float32
- name: '1871'
dtype: float32
- name: '1872'
dtype: float32
- name: '1873'
dtype: float32
- name: '1874'
dtype: float32
- name: '1875'
dtype: float32
- name: '1876'
dtype: float32
- name: '1877'
dtype: float32
- name: '1878'
dtype: float32
- name: '1879'
dtype: float32
- name: '1880'
dtype: float32
- name: '1881'
dtype: float32
- name: '1882'
dtype: float32
- name: '1883'
dtype: float32
- name: '1884'
dtype: float32
- name: '1885'
dtype: float32
- name: '1886'
dtype: float32
- name: '1887'
dtype: float32
- name: '1888'
dtype: float32
- name: '1889'
dtype: float32
- name: '1890'
dtype: float32
- name: '1891'
dtype: float32
- name: '1892'
dtype: float32
- name: '1893'
dtype: float32
- name: '1894'
dtype: float32
- name: '1895'
dtype: float32
- name: '1896'
dtype: float32
- name: '1897'
dtype: float32
- name: '1898'
dtype: float32
- name: '1899'
dtype: float32
- name: '1900'
dtype: float32
- name: '1901'
dtype: float32
- name: '1902'
dtype: float32
- name: '1903'
dtype: float32
- name: '1904'
dtype: float32
- name: '1905'
dtype: float32
- name: '1906'
dtype: float32
- name: '1907'
dtype: float32
- name: '1908'
dtype: float32
- name: '1909'
dtype: float32
- name: '1910'
dtype: float32
- name: '1911'
dtype: float32
- name: '1912'
dtype: float32
- name: '1913'
dtype: float32
- name: '1914'
dtype: float32
- name: '1915'
dtype: float32
- name: '1916'
dtype: float32
- name: '1917'
dtype: float32
- name: '1918'
dtype: float32
- name: '1919'
dtype: float32
- name: '1920'
dtype: float32
- name: '1921'
dtype: float32
- name: '1922'
dtype: float32
- name: '1923'
dtype: float32
- name: '1924'
dtype: float32
- name: '1925'
dtype: float32
- name: '1926'
dtype: float32
- name: '1927'
dtype: float32
- name: '1928'
dtype: float32
- name: '1929'
dtype: float32
- name: '1930'
dtype: float32
- name: '1931'
dtype: float32
- name: '1932'
dtype: float32
- name: '1933'
dtype: float32
- name: '1934'
dtype: float32
- name: '1935'
dtype: float32
- name: '1936'
dtype: float32
- name: '1937'
dtype: float32
- name: '1938'
dtype: float32
- name: '1939'
dtype: float32
- name: '1940'
dtype: float32
- name: '1941'
dtype: float32
- name: '1942'
dtype: float32
- name: '1943'
dtype: float32
- name: '1944'
dtype: float32
- name: '1945'
dtype: float32
- name: '1946'
dtype: float32
- name: '1947'
dtype: float32
- name: '1948'
dtype: float32
- name: '1949'
dtype: float32
- name: '1950'
dtype: float32
- name: '1951'
dtype: float32
- name: '1952'
dtype: float32
- name: '1953'
dtype: float32
- name: '1954'
dtype: float32
- name: '1955'
dtype: float32
- name: '1956'
dtype: float32
- name: '1957'
dtype: float32
- name: '1958'
dtype: float32
- name: '1959'
dtype: float32
- name: '1960'
dtype: float32
- name: '1961'
dtype: float32
- name: '1962'
dtype: float32
- name: '1963'
dtype: float32
- name: '1964'
dtype: float32
- name: '1965'
dtype: float32
- name: '1966'
dtype: float32
- name: '1967'
dtype: float32
- name: '1968'
dtype: float32
- name: '1969'
dtype: float32
- name: '1970'
dtype: float32
- name: '1971'
dtype: float32
- name: '1972'
dtype: float32
- name: '1973'
dtype: float32
- name: '1974'
dtype: float32
- name: '1975'
dtype: float32
- name: '1976'
dtype: float32
- name: '1977'
dtype: float32
- name: '1978'
dtype: float32
- name: '1979'
dtype: float32
- name: '1980'
dtype: float32
- name: '1981'
dtype: float32
- name: '1982'
dtype: float32
- name: '1983'
dtype: float32
- name: '1984'
dtype: float32
- name: '1985'
dtype: float32
- name: '1986'
dtype: float32
- name: '1987'
dtype: float32
- name: '1988'
dtype: float32
- name: '1989'
dtype: float32
- name: '1990'
dtype: float32
- name: '1991'
dtype: float32
- name: '1992'
dtype: float32
- name: '1993'
dtype: float32
- name: '1994'
dtype: float32
- name: '1995'
dtype: float32
- name: '1996'
dtype: float32
- name: '1997'
dtype: float32
- name: '1998'
dtype: float32
- name: '1999'
dtype: float32
- name: '2000'
dtype: float32
- name: '2001'
dtype: float32
- name: '2002'
dtype: float32
- name: '2003'
dtype: float32
- name: '2004'
dtype: float32
- name: '2005'
dtype: float32
- name: '2006'
dtype: float32
- name: '2007'
dtype: float32
- name: '2008'
dtype: float32
- name: '2009'
dtype: float32
- name: '2010'
dtype: float32
- name: '2011'
dtype: float32
- name: '2012'
dtype: float32
- name: '2013'
dtype: float32
- name: '2014'
dtype: float32
- name: '2015'
dtype: float32
- name: '2016'
dtype: float32
- name: '2017'
dtype: float32
- name: '2018'
dtype: float32
- name: '2019'
dtype: float32
- name: '2020'
dtype: float32
- name: '2021'
dtype: float32
- name: '2022'
dtype: float32
- name: '2023'
dtype: float32
- name: '2024'
dtype: float32
- name: '2025'
dtype: float32
- name: '2026'
dtype: float32
- name: '2027'
dtype: float32
- name: '2028'
dtype: float32
- name: '2029'
dtype: float32
- name: '2030'
dtype: float32
- name: '2031'
dtype: float32
- name: '2032'
dtype: float32
- name: '2033'
dtype: float32
- name: '2034'
dtype: float32
- name: '2035'
dtype: float32
- name: '2036'
dtype: float32
- name: '2037'
dtype: float32
- name: '2038'
dtype: float32
- name: '2039'
dtype: float32
- name: '2040'
dtype: float32
- name: '2041'
dtype: float32
- name: '2042'
dtype: float32
- name: '2043'
dtype: float32
- name: '2044'
dtype: float32
- name: '2045'
dtype: float32
- name: '2046'
dtype: float32
- name: '2047'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 307621182
num_examples: 37500
- name: test
num_bytes: 102540387
num_examples: 12500
download_size: 141533040
dataset_size: 410161569
---
# Dataset Card for "CSIC_GPTNEO_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andersonbcdefg/paq_gooaq_pairs_cf | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: sim
dtype: float32
splits:
- name: train
num_bytes: 6088226564.960985
num_examples: 9617549
download_size: 4076519230
dataset_size: 6088226564.960985
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hippocrates/MedQA_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 28990738
num_examples: 10178
- name: valid
num_bytes: 3622152
num_examples: 1272
- name: test
num_bytes: 3678270
num_examples: 1273
download_size: 14570611
dataset_size: 36291160
---
# Dataset Card for "MedQA_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
srikanthsri/Srimentalhealth | ---
license: openrail
---
|
loubnabnl/textbooks-filtering-600-samples | ---
configs:
- config_name: default
data_files:
- split: llama_70b_sample_prompt0
path: data/llama_70b_sample_prompt0-*
- split: llama_70b_greedy
path: data/llama_70b_greedy-*
- split: llama_70b_greedy_discrete
path: data/llama_70b_greedy_discrete-*
- split: llama_70b_greedy_no_conf
path: data/llama_70b_greedy_no_conf-*
- split: llama_70b_greedy_no_conf_noprefix
path: data/llama_70b_greedy_no_conf_noprefix-*
- split: llama_70b_meta
path: data/llama_70b_meta-*
- split: llama_70b_nometa
path: data/llama_70b_nometa-*
- split: llama_70b_meta_v2
path: data/llama_70b_meta_v2-*
- split: chatgpt
path: data/chatgpt-*
- split: gpt4
path: data/gpt4-*
dataset_info:
features:
- name: completion
dtype: string
- name: eval_prompt_header
dtype: string
- name: generation_config
struct:
- name: temperature
dtype: float64
- name: top_p
dtype: float64
- name: prompt
dtype: string
- name: review_model
dtype: string
- name: score
dtype: float64
splits:
- name: llama_70b_sample_prompt0
num_bytes: 2756529
num_examples: 600
- name: llama_70b_greedy
num_bytes: 3139908
num_examples: 600
- name: llama_70b_greedy_discrete
num_bytes: 3138291
num_examples: 600
- name: llama_70b_greedy_no_conf
num_bytes: 3359124
num_examples: 600
- name: llama_70b_greedy_no_conf_noprefix
num_bytes: 3461124
num_examples: 600
- name: llama_70b_meta
num_bytes: 3085159
num_examples: 600
- name: llama_70b_nometa
num_bytes: 3068954
num_examples: 600
- name: llama_70b_meta_v2
num_bytes: 3327190
num_examples: 600
- name: chatgpt
num_bytes: 2772298
num_examples: 600
- name: gpt4
num_bytes: 2800099
num_examples: 600
download_size: 1748097
dataset_size: 30908676
---
# Dataset Card for "textbooks-filtering-600-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/7290000_Groups_Chinese_Vietnamese_Parallel_Corpus_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
7.29 million pairs of sentences - this Chinese-Vietnamese parallel corpus is stored in text format. It covers multiple fields such as tourism, medical treatment, daily life, and news. The data has been desensitized and quality checked. It can be used as a basic corpus for text data analysis in fields such as machine translation.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1170?source=Huggingface
## Storage format
TXT
## Data content
Chinese-Vietnamese Parallel Corpus Data
## Data size
7.29 million pairs of Chinese-Vietnamese Parallel Corpus Data
## Language
Chinese, Vietnamese
## Application scenario
machine translation
## Accuracy rate
90%
# Licensing Information
Commercial License
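## Usage example
Since the corpus is distributed as plain TXT, a minimal sketch for reading sentence pairs is shown below. The file name and the tab-separated `zh<TAB>vi` layout are assumptions for illustration; adjust them to the actual delivered format.
```python
# Minimal sketch for reading a TXT parallel corpus.
# Assumptions: one sentence pair per line, tab-separated as "zh<TAB>vi";
# the real file name and delimiter depend on the delivered data.
from pathlib import Path


def read_pairs(path: str):
    """Yield (chinese, vietnamese) sentence pairs from a TXT file."""
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        parts = line.split("\t")
        if len(parts) == 2:  # skip malformed or empty lines
            yield parts[0].strip(), parts[1].strip()


for zh, vi in read_pairs("zh_vi_parallel.txt"):  # hypothetical file name
    print(zh, "->", vi)
    break
```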
|
james-burton/text-exp-qa-hard | ---
dataset_info:
features:
- name: predicted_class
dtype: string
- name: classes_dict
dtype: string
- name: feature_nums
sequence: string
- name: sign
sequence: string
- name: values
sequence: string
- name: question
dtype: string
- name: answer
dtype: string
- name: id
dtype: int32
- name: question_id
dtype: int32
splits:
- name: train
num_bytes: 13000973
num_examples: 27000
- name: validation
num_bytes: 1445534
num_examples: 3000
- name: test
num_bytes: 297588
num_examples: 469
download_size: 1800431
dataset_size: 14744095
---
# Dataset Card for "text-exp-qa-hard"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/efatale | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Ef - A Tale
This is the image base of the bangumi Ef - A Tale. We detected 21 characters and 2,004 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may still contain noise.** If you intend to train models on this dataset, we recommend preprocessing the downloaded images to remove potential noisy samples (approximately 1% of the images); a download-and-extract sketch follows the preview table.
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 101 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 52 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 127 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 252 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 157 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 193 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 53 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 81 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 21 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 15 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 323 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 103 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 11 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 24 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 10 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 24 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 76 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 44 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 155 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 27 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 155 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
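As a starting point, a minimal sketch for fetching and unpacking one character's archive might look like the following. It assumes the per-character `<id>/dataset.zip` layout shown in the table above and uses `huggingface_hub` as one way to fetch files from the dataset repo.
```python
# Minimal sketch: download and extract one character's image archive.
# Assumes the per-character "<id>/dataset.zip" layout shown in the preview table.
import zipfile

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="BangumiBase/efatale",
    filename="0/dataset.zip",  # character #0's archive
    repo_type="dataset",
)
with zipfile.ZipFile(path) as zf:
    zf.extractall("efatale_char0")  # inspect/clean the images before training
```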
|
open-llm-leaderboard/details_AA051610__A0120 | ---
pretty_name: Evaluation run of AA051610/A0120
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/A0120](https://huggingface.co/AA051610/A0120) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__A0120\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T21:18:32.527803](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A0120/blob/main/results_2024-01-19T21-18-32.527803.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7417714462373405,\n\
\ \"acc_stderr\": 0.028978132404413697,\n \"acc_norm\": 0.7454834976249006,\n\
\ \"acc_norm_stderr\": 0.029530861672701678,\n \"mc1\": 0.39657282741738065,\n\
\ \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5748062315132791,\n\
\ \"mc2_stderr\": 0.015431295873654757\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839159,\n\
\ \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635474\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6612228639713205,\n\
\ \"acc_stderr\": 0.004723266971563396,\n \"acc_norm\": 0.8515236008763195,\n\
\ \"acc_norm_stderr\": 0.0035484490542860105\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n\
\ \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387536,\n\
\ \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387536\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6640211640211641,\n \"acc_stderr\": 0.02432631052914914,\n \"\
acc_norm\": 0.6640211640211641,\n \"acc_norm_stderr\": 0.02432631052914914\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8709677419354839,\n\
\ \"acc_stderr\": 0.019070889254792747,\n \"acc_norm\": 0.8709677419354839,\n\
\ \"acc_norm_stderr\": 0.019070889254792747\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5763546798029556,\n \"acc_stderr\": 0.03476725747649037,\n\
\ \"acc_norm\": 0.5763546798029556,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781657,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781657\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"\
acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476442,\n\
\ \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476442\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.019982347208637303,\n\
\ \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637303\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707946,\n\
\ \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944216,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944216\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9100917431192661,\n \"acc_stderr\": 0.012264304540230435,\n \"\
acc_norm\": 0.9100917431192661,\n \"acc_norm_stderr\": 0.012264304540230435\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.020871118455552097,\n \"\
acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.020871118455552097\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758535,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758535\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275896,\n \"\
acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275896\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n\
\ \"acc_stderr\": 0.03247224389917948,\n \"acc_norm\": 0.8703703703703703,\n\
\ \"acc_norm_stderr\": 0.03247224389917948\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553838,\n\
\ \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553838\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.047184714852195865,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.047184714852195865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.031766839486404054,\n\
\ \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.031766839486404054\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.01604626163167314,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.01604626163167314\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n\
\ \"acc_stderr\": 0.010397417087292847,\n \"acc_norm\": 0.9067688378033205,\n\
\ \"acc_norm_stderr\": 0.010397417087292847\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252555,\n\
\ \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6770949720670391,\n\
\ \"acc_stderr\": 0.015638440380241474,\n \"acc_norm\": 0.6770949720670391,\n\
\ \"acc_norm_stderr\": 0.015638440380241474\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n\
\ \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n\
\ \"acc_stderr\": 0.022122439772480774,\n \"acc_norm\": 0.8135048231511254,\n\
\ \"acc_norm_stderr\": 0.022122439772480774\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8179012345679012,\n \"acc_stderr\": 0.021473491834808355,\n\
\ \"acc_norm\": 0.8179012345679012,\n \"acc_norm_stderr\": 0.021473491834808355\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6028368794326241,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.6028368794326241,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5645371577574967,\n\
\ \"acc_stderr\": 0.012663412101248345,\n \"acc_norm\": 0.5645371577574967,\n\
\ \"acc_norm_stderr\": 0.012663412101248345\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.022571771025494757,\n\
\ \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.022571771025494757\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7777777777777778,\n \"acc_stderr\": 0.016819028375736383,\n \
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.016819028375736383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355027,\n\
\ \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355027\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n\
\ \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5748062315132791,\n\
\ \"mc2_stderr\": 0.015431295873654757\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.01094187795567621\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6709628506444276,\n \
\ \"acc_stderr\": 0.012942375603679368\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/A0120
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|arc:challenge|25_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|gsm8k|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hellaswag|10_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T21-18-32.527803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T21-18-32.527803.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- '**/details_harness|winogrande|5_2024-01-19T21-18-32.527803.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T21-18-32.527803.parquet'
- config_name: results
data_files:
- split: 2024_01_19T21_18_32.527803
path:
- results_2024-01-19T21-18-32.527803.parquet
- split: latest
path:
- results_2024-01-19T21-18-32.527803.parquet
---
# Dataset Card for Evaluation run of AA051610/A0120
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/A0120](https://huggingface.co/AA051610/A0120) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__A0120",
"harness_winogrande_5",
split="train")
```
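Similarly, the aggregated metrics shown below can be loaded from the `results` configuration; its `latest` split mirrors the most recent timestamped run:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; "latest" always mirrors
# the newest timestamped split of the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_AA051610__A0120",
    "results",
    split="latest",
)
print(results)
```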
## Latest results
These are the [latest results from run 2024-01-19T21:18:32.527803](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A0120/blob/main/results_2024-01-19T21-18-32.527803.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7417714462373405,
"acc_stderr": 0.028978132404413697,
"acc_norm": 0.7454834976249006,
"acc_norm_stderr": 0.029530861672701678,
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5748062315132791,
"mc2_stderr": 0.015431295873654757
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839159,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.6612228639713205,
"acc_stderr": 0.004723266971563396,
"acc_norm": 0.8515236008763195,
"acc_norm_stderr": 0.0035484490542860105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6640211640211641,
"acc_stderr": 0.02432631052914914,
"acc_norm": 0.6640211640211641,
"acc_norm_stderr": 0.02432631052914914
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8709677419354839,
"acc_stderr": 0.019070889254792747,
"acc_norm": 0.8709677419354839,
"acc_norm_stderr": 0.019070889254792747
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5763546798029556,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.5763546798029556,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781657,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781657
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476442,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476442
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637303,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637303
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707946,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944216,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944216
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9100917431192661,
"acc_stderr": 0.012264304540230435,
"acc_norm": 0.9100917431192661,
"acc_norm_stderr": 0.012264304540230435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.020871118455552097,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.020871118455552097
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758535,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758535
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275896,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275896
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917948,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917948
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553838,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553838
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.047184714852195865,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.047184714852195865
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.031766839486404054,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.031766839486404054
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.01604626163167314,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.01604626163167314
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292847,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292847
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252555,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6770949720670391,
"acc_stderr": 0.015638440380241474,
"acc_norm": 0.6770949720670391,
"acc_norm_stderr": 0.015638440380241474
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.022122439772480774,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.022122439772480774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8179012345679012,
"acc_stderr": 0.021473491834808355,
"acc_norm": 0.8179012345679012,
"acc_norm_stderr": 0.021473491834808355
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6028368794326241,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.6028368794326241,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5645371577574967,
"acc_stderr": 0.012663412101248345,
"acc_norm": 0.5645371577574967,
"acc_norm_stderr": 0.012663412101248345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.022571771025494757,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.022571771025494757
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.016819028375736383,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.016819028375736383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101716,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101716
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355027,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355027
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5748062315132791,
"mc2_stderr": 0.015431295873654757
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.01094187795567621
},
"harness|gsm8k|5": {
"acc": 0.6709628506444276,
"acc_stderr": 0.012942375603679368
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mstz/yeast | ---
language:
- en
tags:
- yeast
- tabular_classification
- binary_classification
- multiclass_classification
- UCI
pretty_name: Yeast
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- yeast
- yeast_0
- yeast_1
- yeast_2
- yeast_3
- yeast_4
- yeast_5
- yeast_6
- yeast_7
- yeast_8
- yeast_9
license: cc
---
# Yeast
The [Yeast dataset](https://archive-beta.ics.uci.edu/dataset/110/yeast) from the [UCI repository](https://archive-beta.ics.uci.edu/).
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/yeast")["train"]
```
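Each configuration listed in the next section can be loaded by name; for instance, a minimal example for the first binary task:
```python
from datasets import load_dataset

# Binary task: is the instance of class 0?
dataset = load_dataset("mstz/yeast", "yeast_0")["train"]
```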
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-----------------------|---------------------------|-------------------------|
| yeast | Multiclass classification.| |
| yeast_0 | Binary classification. | Is the instance of class 0? |
| yeast_1 | Binary classification. | Is the instance of class 1? |
| yeast_2 | Binary classification. | Is the instance of class 2? |
| yeast_3 | Binary classification. | Is the instance of class 3? |
| yeast_4 | Binary classification. | Is the instance of class 4? |
| yeast_5 | Binary classification. | Is the instance of class 5? |
| yeast_6 | Binary classification. | Is the instance of class 6? |
| yeast_7 | Binary classification. | Is the instance of class 7? |
| yeast_8 | Binary classification. | Is the instance of class 8? |
| yeast_9 | Binary classification. | Is the instance of class 9? | |
Multimodal-Fatima/Food101_10samples_class_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple pie
'1': baby back ribs
'2': baklava
'3': beef carpaccio
'4': beef tartare
'5': beet salad
'6': beignets
'7': bibimbap
'8': bread pudding
'9': breakfast burrito
'10': bruschetta
'11': caesar salad
'12': cannoli
'13': caprese salad
'14': carrot cake
'15': ceviche
'16': cheesecake
'17': cheese plate
'18': chicken curry
'19': chicken quesadilla
'20': chicken wings
'21': chocolate cake
'22': chocolate mousse
'23': churros
'24': clam chowder
'25': club sandwich
'26': crab cakes
'27': creme brulee
'28': croque madame
'29': cup cakes
'30': deviled eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs benedict
'35': escargots
'36': falafel
'37': filet mignon
'38': fish and chips
'39': foie gras
'40': french fries
'41': french onion soup
'42': french toast
'43': fried calamari
'44': fried rice
'45': frozen yogurt
'46': garlic bread
'47': gnocchi
'48': greek salad
'49': grilled cheese sandwich
'50': grilled salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot and sour soup
'55': hot dog
'56': huevos rancheros
'57': hummus
'58': ice cream
'59': lasagna
'60': lobster bisque
'61': lobster roll sandwich
'62': macaroni and cheese
'63': macarons
'64': miso soup
'65': mussels
'66': nachos
'67': omelette
'68': onion rings
'69': oysters
'70': pad thai
'71': paella
'72': pancakes
'73': panna cotta
'74': peking duck
'75': pho
'76': pizza
'77': pork chop
'78': poutine
'79': prime rib
'80': pulled pork sandwich
'81': ramen
'82': ravioli
'83': red velvet cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed salad
'89': shrimp and grits
'90': spaghetti bolognese
'91': spaghetti carbonara
'92': spring rolls
'93': steak
'94': strawberry shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna tartare
'100': waffles
- name: Attributes_ViT_L_14_text_davinci_003_full
sequence: string
- name: Attributes_ViT_L_14_text_davinci_003_food101
sequence: string
- name: clip_tags_ViT_L_14_with_openai_classes
sequence: string
- name: clip_tags_ViT_L_14_wo_openai_classes
sequence: string
- name: clip_tags_ViT_L_14_simple_specific
dtype: string
- name: clip_tags_ViT_L_14_ensemble_specific
dtype: string
- name: clip_tags_ViT_B_16_simple_specific
dtype: string
- name: clip_tags_ViT_B_16_ensemble_specific
dtype: string
- name: clip_tags_ViT_B_32_simple_specific
dtype: string
- name: clip_tags_ViT_B_32_ensemble_specific
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_ViT_B_16_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_ensemble_specific
dtype: string
- name: id
dtype: int64
splits:
- name: test
num_bytes: 52378377.61
num_examples: 1010
download_size: 50351451
dataset_size: 52378377.61
---
# Dataset Card for "Food101_10samples_class_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/OK-VQA_test_google_flan_t5_xxl_mode_T_A_D_PNP_FILTER_C_Q_rices_ns_5046 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 49493860
num_examples: 5046
- name: fewshot_0_clip_tags_ViT_L_14_with_openai_Attributes_ViT_L_14_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 49672229
num_examples: 5046
download_size: 16604017
dataset_size: 99166089
---
# Dataset Card for "OK-VQA_test_google_flan_t5_xxl_mode_T_A_D_PNP_FILTER_C_Q_rices_ns_5046"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sankovic/joaopaulogkdatasettt | ---
license: openrail
---
|
aagoluoglu/AI_HW3_object_frames | ---
dataset_info:
features:
- name: detected_obj_id
dtype: int64
- name: cropped_frame_encoded_base64
dtype: string
splits:
- name: train
num_bytes: 324924360
num_examples: 1111
download_size: 221658160
dataset_size: 324924360
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
glucose | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-ROC-stories
task_categories:
- fill-mask
- text-generation
paperswithcode_id: glucose
pretty_name: GLUCOSE
tags:
- commonsense-inference
dataset_info:
features:
- name: experiment_id
dtype: string
- name: story_id
dtype: string
- name: worker_id
dtype: int64
- name: worker_ids
dtype: string
- name: submission_time_normalized
dtype: string
- name: worker_quality_assessment
dtype: int64
- name: selected_sentence_index
dtype: int64
- name: story
dtype: string
- name: selected_sentence
dtype: string
- name: number_filled_in
dtype: int64
- name: 1_specificNL
dtype: string
- name: 1_specificStructured
dtype: string
- name: 1_generalNL
dtype: string
- name: 1_generalStructured
dtype: string
- name: 2_specificNL
dtype: string
- name: 2_specificStructured
dtype: string
- name: 2_generalNL
dtype: string
- name: 2_generalStructured
dtype: string
- name: 3_specificNL
dtype: string
- name: 3_specificStructured
dtype: string
- name: 3_generalNL
dtype: string
- name: 3_generalStructured
dtype: string
- name: 4_specificNL
dtype: string
- name: 4_specificStructured
dtype: string
- name: 4_generalNL
dtype: string
- name: 4_generalStructured
dtype: string
- name: 5_specificNL
dtype: string
- name: 5_specificStructured
dtype: string
- name: 5_generalNL
dtype: string
- name: 5_generalStructured
dtype: string
- name: 6_specificNL
dtype: string
- name: 6_specificStructured
dtype: string
- name: 6_generalNL
dtype: string
- name: 6_generalStructured
dtype: string
- name: 7_specificNL
dtype: string
- name: 7_specificStructured
dtype: string
- name: 7_generalNL
dtype: string
- name: 7_generalStructured
dtype: string
- name: 8_specificNL
dtype: string
- name: 8_specificStructured
dtype: string
- name: 8_generalNL
dtype: string
- name: 8_generalStructured
dtype: string
- name: 9_specificNL
dtype: string
- name: 9_specificStructured
dtype: string
- name: 9_generalNL
dtype: string
- name: 9_generalStructured
dtype: string
- name: 10_specificNL
dtype: string
- name: 10_specificStructured
dtype: string
- name: 10_generalNL
dtype: string
- name: 10_generalStructured
dtype: string
config_name: glucose
splits:
- name: train
num_bytes: 204605370
num_examples: 65522
- name: test
num_bytes: 355757
num_examples: 500
download_size: 30362105
dataset_size: 204961127
---
# Dataset Card for GLUCOSE
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **[Repository](https://github.com/TevenLeScao/glucose)**
- **[Paper](https://arxiv.org/abs/2009.07758)**
- **Point of Contact:** [glucose@elementalcognition.com](mailto:glucose@elementalcognition.com)
### Dataset Summary
GLUCOSE (GeneraLized and COntextualized Story Explanations) is a novel conceptual framework and dataset for commonsense reasoning. Given a short story and a sentence X in the story, GLUCOSE captures ten dimensions of causal explanation related to X. These dimensions, inspired by human cognitive psychology, cover often-implicit causes and effects of X, including events, location, possession, and other attributes.
### Supported Tasks and Leaderboards
Common sense inference of:
1. Causes
2. Emotions motivating an event
3. Locations enabling an event
4. Possession states enabling an event
5. Other attributes enabling an event
6. Consequences
7. Emotions caused by an event
8. Changes in location caused by an event
9. Changes in possession caused by an event
10. Other attributes that may be changed by an event
### Languages
English, monolingual
## Dataset Structure
### Data Instances
```
{
"experiment_id": "e56c7c3e-4660-40fb-80d0-052d566d676a__4",
"story_id": "e56c7c3e-4660-40fb-80d0-052d566d676a",
"worker_id": 19,
"submission_time_normalized": "20190930",
"worker_quality_rating": 3,
"selected_sentence_index": 4,
"story": "It was bedtime at our house. Two of the three kids hit the pillow and fall asleep. The third is a trouble maker. For two hours he continues to get out of bed and want to play. Finally he becomes tired and falls asleep."
selected_sentence: "Finally he becomes tired and falls asleep.",
"1_specificNL": "The third kid continues to get out of bed and wants to play >Causes/Enables> The kid finally becomes tired and falls asleep",
"1_specificStructured": "{The third kid}_[subject] {continues}_[verb] {to }_[preposition1] {get out of bed}_[object1] {and wants to play}_[object2] >Causes/Enables> {The kid}_[subject] {finally becomes}_[verb] {tired}_[object1] {and falls asleep}_[object2]",
"1_generalNL": "Someone_A doesn't want to go to sleep >Causes/Enables> Someone_A finally falls asleep",
"1_generalStructured": "{Someone_A}_[subject] {doesn't want}_[verb] {to }_[preposition1] {go to sleep}_[object1] >Causes/Enables> {Someone_A}_[subject] {finally falls}_[verb] {asleep}_[object1]",
"2_specificNL": "escaped",
"2_specificStructured": "escaped",
"2_generalNL": "escaped",
"2_generalStructured": "escaped",
"3_specificNL": "The third kid is in bed >Enables> The kid finally becomes tired and falls asleep",
"3_specificStructured": "{The third kid}_[subject] {is}_[verb] {in}_[preposition] {bed}_[object] >Enables> {The kid}_[subject] {finally becomes}_[verb] {tired}_[object1] {and falls asleep}_[object2]",
"3_generalNL": "Someone_A is in bed >Enables> Someone_A falls asleep",
"3_generalStructured": "{Someone_A}_[subject] {is}_[verb] {in}_[preposition] {bed}_[object] >Enables> {Someone_A}_[subject] {falls}_[verb] {asleep}_[object1]",
"4_specificNL": "escaped",
"4_specificStructured": "escaped",
"4_generalNL": "escaped",
"4_generalStructured": "escaped",
"5_specificNL": "escaped",
"5_specificStructured": "escaped",
"5_generalNL": "escaped",
"5_generalStructured": "escaped",
"6_specificNL": "escaped",
"6_specificStructured": "escaped",
"6_generalNL": "escaped",
"6_generalStructured": "escaped",
"7_specificNL": "escaped",
"7_specificStructured": "escaped",
"7_generalNL": "escaped",
"7_generalStructured": "escaped",
"8_specificNL": "escaped",
"8_specificStructured": "escaped",
"8_generalNL": "escaped",
"8_generalStructured": "escaped",
"9_specificNL": "escaped",
"9_specificStructured": "escaped",
"9_generalNL": "escaped",
"9_generalStructured": "escaped",
"10_specificNL": "escaped",
"10_specificStructured": "escaped",
"10_generalNL": "escaped",
"10_generalStructured": "escaped",
"number_filled_in": 7
}
```
### Data Fields
- __experiment_id__: a randomly generated alphanumeric sequence for a given story with the sentence index appended at the end after two underscores. Example: cbee2b5a-f2f9-4bca-9630-6825b1e36c13__0
- __story_id__: a random alphanumeric identifier for the story. Example: e56c7c3e-4660-40fb-80d0-052d566d676a
- __worker_id__: each worker has a unique identification number. Example: 21
- __submission_time_normalized__: the time of submission in the format YYYYMMDD. Example: 20200115
- __worker_quality_assessment__: rating for the worker on the assignment in the row. Example: 2
- __selected_sentence_index__: the index of a given sentence in a story. Example: 0
- __story__: contains the full text of the ROC story that was used for the HIT. Example: It was bedtime at our house. Two of the three kids hit the pillow and fall asleep. The third is a trouble maker. For two hours he continues to get out of bed and want to play. Finally he becomes tired and falls asleep.
- __selected_sentence__: the sentence from the story that is being annotated. Example: It was bedtime at our house.
- __[1-10]\_[specific/general][NL/Structured]__: This is the primary data collected. It provides the common sense knowledge about the related stories and those general rules about the world derived from the specific statements. For each of the ten relationships, there are four columns. The specific columns give the specific statements from the story. The general statements give the corresponding generalization. The NL columns are formatted in natural language, whereas the structured columns contain indications of the slots used to fill in the data. Example:
- __1_specificNL__: "The school has a football team >Causes/Enables> The football game was last weekend"
- __1_specificStructured__: "{The school }\_[subject] {has }\_[verb] {a football team }\_[object1] >Causes/Enables> {The football game }\_[subject] {was last weekend }\_[verb]"
- __1_generalNL__: "Somewhere_A (that is a school ) has Something_A (that is a sports team ) >Causes/Enables> The game was last weekend"
- __1_generalStructured__: "{Somewhere_A ||that is a school ||}\_[subject] {has }\_[verb] {Something_A ||that is a sports team ||}\_[object1] >Causes/Enables> {The game }\_[subject] {was last weekend }\_[verb]"
- __number\_filled\_in__: number of dimensions filled in for the assignment. Example: 4
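Because the dimension fields follow this regular naming scheme, they can be gathered programmatically; a minimal sketch (the `escaped` sentinel, shown in the instance above, marks dimensions the annotator did not fill in):
```python
from datasets import load_dataset

glucose = load_dataset("glucose", split="train")
example = glucose[0]

# Collect the specific natural-language statement for each of the ten
# dimensions, skipping dimensions marked "escaped" (not applicable)
specific = {
    dim: example[f"{dim}_specificNL"]
    for dim in range(1, 11)
    if example[f"{dim}_specificNL"] != "escaped"
}
print(specific)
```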
### Data Splits
Train split: 65,522 examples
Test split: 500 examples, without worker id and rating, number filled in, and structured text.
## Dataset Creation
### Curation Rationale
When humans read or listen, they make implicit commonsense inferences that frame their understanding of what happened and why. As a step toward AI systems that can build similar mental models, we introduce GLUCOSE, a large-scale dataset of implicit commonsense causal knowledge, encoded as causal mini-theories about the world, each grounded in a narrative context.
### Source Data
#### Initial Data Collection and Normalization
Initial text from ROCStories
#### Who are the source language producers?
Amazon Mechanical Turk.
### Annotations
#### Annotation process
To enable developing models that can build mental models of narratives, we aimed to crowdsource a large, quality-monitored dataset. Beyond the scalability benefits, using crowd workers (as opposed to a small set of expert annotators) ensures diversity of thought, thus broadening coverage of a common-sense knowledge resource. The annotation task is complex: it requires annotators to understand different causal dimensions in a variety of contexts and to come up with generalized theories beyond the story context. For
strict quality control, we designed a three-stage knowledge acquisition pipeline for crowdsourcing the GLUCOSE dataset on the Amazon Mechanical Turk Platform. The workers first go through a qualification test where they must score at least 90% on 10 multiple-choice questions on select GLUCOSE dimensions. Next, qualified workers can work on the main GLUCOSE data collection task: given a story S and a story sentence X, they are asked to fill in (allowing for non-applicable) all ten GLUCOSE dimensions, getting step-by-step guidance from the GLUCOSE data acquisition UI. To ensure data consistency, the same workers answer all dimensions for a given (S, X) pair. Finally, the submissions are reviewed by an expert who rates each worker on a scale from 0 to 3, and provides feedback on how to improve. Our final UIs are the result of more than six rounds of pilot studies, iteratively improving the interaction elements, functionality, dimension definitions, instructions, and examples.
#### Who are the annotators?
Amazon Mechanical Turk workers, with feedback from an expert.
### Personal and Sensitive Information
No personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Nasrin Mostafazadeh, Aditya Kalyanpur, Lori Moon, David Buchanan, Lauren Berkowitz, Or Biran, Jennifer Chu-Carroll, from Elemental Cognition
### Licensing Information
Creative Commons Attribution-NonCommercial 4.0 International Public License
### Citation Information
```
@inproceedings{mostafazadeh2020glucose,
title={GLUCOSE: GeneraLized and COntextualized Story Explanations},
author={Nasrin Mostafazadeh and Aditya Kalyanpur and Lori Moon and David Buchanan and Lauren Berkowitz and Or Biran and Jennifer Chu-Carroll},
year={2020},
booktitle={The Conference on Empirical Methods in Natural Language Processing},
publisher={Association for Computational Linguistics}
}
```
### Contributions
Thanks to [@TevenLeScao](https://github.com/TevenLeScao) for adding this dataset. |
liuyanchen1015/MULTI_VALUE_mrpc_aint_be | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 14577
num_examples: 59
- name: train
num_bytes: 33458
num_examples: 127
- name: validation
num_bytes: 4482
num_examples: 18
download_size: 45197
dataset_size: 52517
---
# Dataset Card for "MULTI_VALUE_mrpc_aint_be"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CATIE-AQ/newsquadfr_fr_prompt_context_generation_with_answer_and_question | ---
language:
- fr
license: cc-by-nc-sa-4.0
size_categories:
- 100K<n<1M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- newsquadfr
---
# newsquadfr_fr_prompt_context_generation_with_answer_and_question
## Summary
**newsquadfr_fr_prompt_context_generation_with_answer_and_question** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **101,040** rows that can be used for a context-generation (with answer) task.
The original data (without prompts) comes from the dataset [newsquadfr](https://huggingface.co/datasets/lincoln/newsquadfr) and was augmented with questions in SQuAD 2.0 format in the [FrenchQA](https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
21 prompts were created for this dataset. The logic applied consists of proposing prompts in the indicative tense, in the informal French form of address (tutoiement), and in the formal form (vouvoiement).
```
'Déterminer la question qui aurait pu être posée pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Détermine la question que tu aurais pu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Déterminez la question que vous auriez pu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question aurait pu être posée pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question aurais-tu pu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question auriez-vous pu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question peut être posée pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question peux-tu poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Quelle question pouvez-vous poser pour obtenir la réponse suivante dans le contexte donné. \n Contexte : "'+context+'";\n Réponse : "'+answer+'";\n Question :',
'Sachant la réponse suivante : "'+answer+'"\n Générer une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Génère une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Générez une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Trouver une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Trouves une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Trouvez une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Créer une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Crée trouver une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Créez trouver une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Ecrire une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Ecris une bonne question pour le texte suivant : "'+context+'"',
'Sachant la réponse suivante : "'+answer+'"\n Ecrivez une bonne question pour le texte suivant : "'+context+'"'
```
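For illustration, a prompt is built by substituting a context and an answer into one of the templates above; the `context`/`answer` values below are hypothetical:
```python
# Hypothetical context/answer pair, for illustration only
context = "Le Tour de France 2023 s'est élancé de Bilbao le 1er juillet."
answer = "de Bilbao"

# Instantiate the first template from the list above
prompt = ('Déterminer la question qui aurait pu être posée pour obtenir la réponse suivante'
          ' dans le contexte donné. \n Contexte : "' + context + '";\n Réponse : "' + answer
          + '";\n Question :')
print(prompt)
```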
# Splits
- `train` with 79,200 samples
- `valid` with 21,800 samples
- no `test` split
# How to use?
```python
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/newsquadfr_fr_prompt_context_generation_with_answer_and_question")
```
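To inspect a single example, assuming the xP3-style `inputs`/`targets` column names:
```python
from datasets import load_dataset

dataset = load_dataset("CATIE-AQ/newsquadfr_fr_prompt_context_generation_with_answer_and_question", split="train")

# Each row pairs a prompted input with its expected target text
print(dataset[0]["inputs"])
print(dataset[0]["targets"])
```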
# Citation
## Original data
> Hugging Face repository: https://huggingface.co/datasets/lincoln/newsquadfr
## This Dataset
> @misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
>   author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
>   title = { DFP (Revision 1d24c09) },
>   year = 2023,
>   url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
>   doi = { 10.57967/hf/1200 },
>   publisher = { Hugging Face }
> }
## License
CC BY-NC-SA 4.0 |
open-llm-leaderboard/details_amu__r-zephyr-7b-beta-qlora | ---
pretty_name: Evaluation run of amu/r-zephyr-7b-beta-qlora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [amu/r-zephyr-7b-beta-qlora](https://huggingface.co/amu/r-zephyr-7b-beta-qlora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_amu__r-zephyr-7b-beta-qlora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T15:14:16.944121](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__r-zephyr-7b-beta-qlora/blob/main/results_2024-02-18T15-14-16.944121.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6296850251136631,\n\
\ \"acc_stderr\": 0.03258075814279175,\n \"acc_norm\": 0.6347979739934548,\n\
\ \"acc_norm_stderr\": 0.033240416199190104,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.46320599364707243,\n\
\ \"mc2_stderr\": 0.015074191577064598\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5989761092150171,\n \"acc_stderr\": 0.014322255790719867,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491887\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6594303923521211,\n\
\ \"acc_stderr\": 0.004729322613301549,\n \"acc_norm\": 0.8538139812786297,\n\
\ \"acc_norm_stderr\": 0.0035257057733534187\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924006,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924006\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936087,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936087\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526501,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526501\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n\
\ \"acc_stderr\": 0.0160182397105134,\n \"acc_norm\": 0.3564245810055866,\n\
\ \"acc_norm_stderr\": 0.0160182397105134\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059608,\n \"mc2\": 0.46320599364707243,\n\
\ \"mc2_stderr\": 0.015074191577064598\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235802\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3904473085670963,\n \
\ \"acc_stderr\": 0.013437829864668578\n }\n}\n```"
repo_url: https://huggingface.co/amu/r-zephyr-7b-beta-qlora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|arc:challenge|25_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|gsm8k|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hellaswag|10_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-14-16.944121.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T15-14-16.944121.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- '**/details_harness|winogrande|5_2024-02-18T15-14-16.944121.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T15-14-16.944121.parquet'
- config_name: results
data_files:
- split: 2024_02_18T15_14_16.944121
path:
- results_2024-02-18T15-14-16.944121.parquet
- split: latest
path:
- results_2024-02-18T15-14-16.944121.parquet
---
# Dataset Card for Evaluation run of amu/r-zephyr-7b-beta-qlora
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [amu/r-zephyr-7b-beta-qlora](https://huggingface.co/amu/r-zephyr-7b-beta-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_amu__r-zephyr-7b-beta-qlora",
"harness_winogrande_5",
	split="latest")
```
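To see every configuration name without scrolling through the YAML above, you can query the Hub directly (a minimal sketch, assuming the standard `datasets` helper `get_dataset_config_names` and network access to the Hub):
```python
from datasets import get_dataset_config_names

# Lists the per-task configurations plus the aggregated "results" config
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_amu__r-zephyr-7b-beta-qlora"
)
print(len(configs), "configurations")
print(configs[:5])
```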
## Latest results
These are the [latest results from run 2024-02-18T15:14:16.944121](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__r-zephyr-7b-beta-qlora/blob/main/results_2024-02-18T15-14-16.944121.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6296850251136631,
"acc_stderr": 0.03258075814279175,
"acc_norm": 0.6347979739934548,
"acc_norm_stderr": 0.033240416199190104,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.46320599364707243,
"mc2_stderr": 0.015074191577064598
},
"harness|arc:challenge|25": {
"acc": 0.5989761092150171,
"acc_stderr": 0.014322255790719867,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491887
},
"harness|hellaswag|10": {
"acc": 0.6594303923521211,
"acc_stderr": 0.004729322613301549,
"acc_norm": 0.8538139812786297,
"acc_norm_stderr": 0.0035257057733534187
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924006,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924006
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936087,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936087
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526501,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526501
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.0160182397105134,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.0160182397105134
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059608,
"mc2": 0.46320599364707243,
"mc2_stderr": 0.015074191577064598
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235802
},
"harness|gsm8k|5": {
"acc": 0.3904473085670963,
"acc_stderr": 0.013437829864668578
}
}
```
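Rather than copying the numbers above by hand, the aggregated file can be fetched straight from the repo. This is a hedged sketch using `huggingface_hub.hf_hub_download` (the filename comes from the "results" config in the YAML front matter; the full JSON file may nest these metrics under a top-level "results" key, which the sketch accounts for):
```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced by the "results" config
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_amu__r-zephyr-7b-beta-qlora",
    filename="results_2024-02-18T15-14-16.944121.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# The excerpt shown above; fall back to the top level if there is no nesting
metrics = raw.get("results", raw)
print(metrics["all"]["acc"])       # 0.6296... in this run
print(metrics["harness|gsm8k|5"])  # per-task entry
```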
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
asrtre/sdfwaqs | ---
license: apache-2.0
---
|
h4rr9/mnist_palette_num_9_bit | ---
dataset_info:
features:
- name: captions
dtype: string
- name: palette_images
dtype: string
splits:
- name: validation
num_bytes: 51290000
num_examples: 10000
- name: train
num_bytes: 307740000
num_examples: 60000
download_size: 41085975
dataset_size: 359030000
---
# Dataset Card for "mnist_palette_9_bit_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sanaeai/CoNaLa | ---
dataset_info:
features:
- name: intent
dtype: string
- name: rewritten_intent
dtype: string
- name: snippet
dtype: string
- name: question_id
dtype: int64
splits:
- name: train
num_bytes: 475799
num_examples: 2879
download_size: 259975
dataset_size: 475799
---
# Dataset Card for "CoNaLa"
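A minimal loading sketch (assuming the standard `datasets` API; the column names follow the `dataset_info` block above):
```python
from datasets import load_dataset

# Single "train" split declared in dataset_info (2,879 examples)
ds = load_dataset("sanaeai/CoNaLa", split="train")
print(ds.column_names)  # ['intent', 'rewritten_intent', 'snippet', 'question_id']
print(ds[0]["snippet"])
```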
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_serial_verb_give | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 315
num_examples: 2
- name: train
num_bytes: 444
num_examples: 4
download_size: 5339
dataset_size: 759
---
# Dataset Card for "MULTI_VALUE_wnli_serial_verb_give"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rasu23/iapp_all_mep_iter0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: generated
list:
- name: content
dtype: string
- name: role
dtype: string
- name: real
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 33113417
num_examples: 5761
download_size: 5629807
dataset_size: 33113417
---
# Dataset Card for "iapp_all_mep_iter0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fuyu-quant/ibl-regression-ver1-branch | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: index
dtype: int64
- name: category
dtype: string
splits:
- name: train
num_bytes: 42612428
num_examples: 30000
- name: test
num_bytes: 1419385
num_examples: 1000
download_size: 20886073
dataset_size: 44031813
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|