id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
mboth/luftVersorgen-100-undersampled | 2023-09-22T05:59:02.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': LuftBereitstellen
'1': LuftVerteilen
splits:
- name: train
num_bytes: 39514.861205145564
num_examples: 200
- name: test
num_bytes: 290707
num_examples: 1477
- name: valid
num_bytes: 290707
num_examples: 1477
download_size: 234233
dataset_size: 620928.8612051455
---
# Dataset Card for "luftVersorgen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
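The YAML metadata above is enough to load and inspect this dataset even though the card body is a stub. A minimal sketch with the `datasets` library (the printed output is illustrative, not taken from the actual rows):

```python
from datasets import load_dataset

# Splits follow the config above: train / test / valid.
ds = load_dataset("mboth/luftVersorgen-100-undersampled")

# `label` is a ClassLabel, so integer ids map back to class names.
label = ds["train"].features["label"]
print(label.names)  # ['LuftBereitstellen', 'LuftVerteilen']

# Look at one training example: the `text` column plus its decoded label.
row = ds["train"][0]
print(row["text"], "->", label.int2str(row["label"]))
```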
mboth/luftVersorgen-200-undersampled | 2023-09-22T05:59:06.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': LuftBereitstellen
'1': LuftVerteilen
splits:
- name: train
num_bytes: 79029.72241029113
num_examples: 400
- name: test
num_bytes: 290707
num_examples: 1477
- name: valid
num_bytes: 290707
num_examples: 1477
download_size: 247001
dataset_size: 660443.7224102912
---
# Dataset Card for "luftVersorgen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Deema/squad_v2_counterfactual | 2023-09-22T06:00:30.000Z | [
"region:us"
] | Deema | null | null | null | 0 | 0 | SQuAD v2 validation set with counterfactual contexts |
cyrilzhang/wiki-bpe-48k | 2023-09-22T06:13:15.000Z | [
"region:us"
] | cyrilzhang | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 20505990100
num_examples: 5001461
- name: test
num_bytes: 206143900
num_examples: 50279
download_size: 9547305598
dataset_size: 20712134000
---
# Dataset Card for "wiki-bpe-48k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
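With a dataset_size of roughly 20 GB of pre-tokenized `input_ids`, streaming is the more practical way to peek at this dataset. A sketch, assuming the default config listed above:

```python
from datasets import load_dataset

# Stream rather than download ~20 GB of tokenized text up front.
stream = load_dataset("cyrilzhang/wiki-bpe-48k", split="train", streaming=True)

# Each record holds a sequence of int32 BPE token ids.
first = next(iter(stream))
print(len(first["input_ids"]), first["input_ids"][:10])
```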
mboth/medienVersorgen-50-undersampled | 2023-09-22T06:11:50.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Bereitstellen
'1': Entsorgen
'2': Speichern
'3': Verteilen
splits:
- name: train
num_bytes: 37075.44918032787
num_examples: 188
- name: test
num_bytes: 14725
num_examples: 77
- name: valid
num_bytes: 14725
num_examples: 77
download_size: 36084
dataset_size: 66525.44918032788
---
# Dataset Card for "medienVersorgen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/medienVersorgen-100-undersampled | 2023-09-22T06:11:54.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Bereitstellen
'1': Entsorgen
'2': Speichern
'3': Verteilen
splits:
- name: train
num_bytes: 59754.580327868855
num_examples: 303
- name: test
num_bytes: 14725
num_examples: 77
- name: valid
num_bytes: 14725
num_examples: 77
download_size: 42237
dataset_size: 89204.58032786885
---
# Dataset Card for "medienVersorgen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/medienVersorgen-200-undersampled | 2023-09-22T06:11:57.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Bereitstellen
'1': Entsorgen
'2': Speichern
'3': Verteilen
splits:
- name: train
num_bytes: 79475.56393442623
num_examples: 403
- name: test
num_bytes: 14725
num_examples: 77
- name: valid
num_bytes: 14725
num_examples: 77
download_size: 47115
dataset_size: 108925.56393442623
---
# Dataset Card for "medienVersorgen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/sichern-50-undersampled | 2023-09-22T06:22:57.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Brandmeldeanlage
'1': Brandschutzklappe
'2': Einbruchmeldeanlage
'3': Entrauchung-Ventilator
'4': Feuerlöschanlage
'5': Gaswarnanlage
'6': Notruf
'7': Rauchmeldeanlage
splits:
- name: train
num_bytes: 38006.082374966565
num_examples: 193
- name: test
num_bytes: 186480
num_examples: 935
- name: valid
num_bytes: 186480
num_examples: 935
download_size: 130269
dataset_size: 410966.0823749666
---
# Dataset Card for "sichern-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/sichern-100-undersampled | 2023-09-22T06:23:01.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Brandmeldeanlage
'1': Brandschutzklappe
'2': Einbruchmeldeanlage
'3': Entrauchung-Ventilator
'4': Feuerlöschanlage
'5': Gaswarnanlage
'6': Notruf
'7': Rauchmeldeanlage
splits:
- name: train
num_bytes: 66362.95212623697
num_examples: 337
- name: test
num_bytes: 186480
num_examples: 935
- name: valid
num_bytes: 186480
num_examples: 935
download_size: 138099
dataset_size: 439322.952126237
---
# Dataset Card for "sichern-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/sichern-200-undersampled | 2023-09-22T06:23:05.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Brandmeldeanlage
'1': Brandschutzklappe
'2': Einbruchmeldeanlage
'3': Entrauchung-Ventilator
'4': Feuerlöschanlage
'5': Gaswarnanlage
'6': Notruf
'7': Rauchmeldeanlage
splits:
- name: train
num_bytes: 105747.49344744584
num_examples: 537
- name: test
num_bytes: 186480
num_examples: 935
- name: valid
num_bytes: 186480
num_examples: 935
download_size: 148661
dataset_size: 478707.49344744586
---
# Dataset Card for "sichern-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
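Per the metadata, the three sichern variants share identical 935-example test and valid splits and differ only in the undersampled train split (193, 337, and 537 examples). A quick sketch to confirm the split sizes and decode the eight class ids:

```python
from datasets import load_dataset

for size in (50, 100, 200):
    ds = load_dataset(f"mboth/sichern-{size}-undersampled")
    print(size, {split: ds[split].num_rows for split in ds})

# Integer labels decode to the system classes, e.g. 0 -> 'Brandmeldeanlage'.
print(ds["train"].features["label"].int2str(0))
```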
cyrilzhang/wiki-bpe-64k | 2023-09-22T06:33:17.000Z | [
"region:us"
] | cyrilzhang | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 20157432700
num_examples: 4916447
- name: test
num_bytes: 202663000
num_examples: 49430
download_size: 8837145740
dataset_size: 20360095700
---
# Dataset Card for "wiki-bpe-64k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/waermeErzeugen-50-undersampled | 2023-09-22T07:11:07.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': BHKW
'1': Kessel
'2': Pelletkessel
'3': Waermepumpe
'4': WaermeversorgerAllgemein
splits:
- name: train
num_bytes: 37366.89908256881
num_examples: 209
- name: test
num_bytes: 38880
num_examples: 218
- name: valid
num_bytes: 38880
num_examples: 218
download_size: 54745
dataset_size: 115126.89908256881
---
# Dataset Card for "waermeErzeugen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/waermeErzeugen-100-undersampled | 2023-09-22T07:11:11.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': BHKW
'1': Kessel
'2': Pelletkessel
'3': Waermepumpe
'4': WaermeversorgerAllgemein
splits:
- name: train
num_bytes: 64185.247706422015
num_examples: 359
- name: test
num_bytes: 38880
num_examples: 218
- name: valid
num_bytes: 38880
num_examples: 218
download_size: 61981
dataset_size: 141945.247706422
---
# Dataset Card for "waermeErzeugen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/waermeErzeugensichern-200-undersampled | 2023-09-22T07:11:14.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': BHKW
'1': Kessel
'2': Pelletkessel
'3': Waermepumpe
'4': WaermeversorgerAllgemein
splits:
- name: train
num_bytes: 117821.94495412844
num_examples: 659
- name: test
num_bytes: 38880
num_examples: 218
- name: valid
num_bytes: 38880
num_examples: 218
download_size: 76901
dataset_size: 195581.94495412844
---
# Dataset Card for "waermeErzeugensichern-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
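Undersampling rebalances the classes in the train split, which can be verified by counting labels per class. A sketch along those lines (the resulting counts depend on the data and are not shown in the card):

```python
from collections import Counter

from datasets import load_dataset

train = load_dataset("mboth/waermeErzeugen-50-undersampled", split="train")
names = train.features["label"].names
# ['BHKW', 'Kessel', 'Pelletkessel', 'Waermepumpe', 'WaermeversorgerAllgemein']

counts = Counter(train["label"])
print({names[i]: n for i, n in sorted(counts.items())})
```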
Sai0720/sampleDemoSet | 2023-09-22T07:17:56.000Z | [
"license:unknown",
"region:us"
] | Sai0720 | null | null | null | 0 | 0 | ---
license: unknown
---
|
open-llm-leaderboard/details_willyninja30__ARIA-70B-French | 2023-09-22T07:24:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of willyninja30/ARIA-70B-French
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [willyninja30/ARIA-70B-French](https://huggingface.co/willyninja30/ARIA-70B-French)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_willyninja30__ARIA-70B-French\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T07:22:49.937285](https://huggingface.co/datasets/open-llm-leaderboard/details_willyninja30__ARIA-70B-French/blob/main/results_2023-09-22T07-22-49.937285.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6386983068475005,\n\
\ \"acc_stderr\": 0.032863621226889406,\n \"acc_norm\": 0.6425916297913504,\n\
\ \"acc_norm_stderr\": 0.03283781788418258,\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.527991738544026,\n\
\ \"mc2_stderr\": 0.015530613367021443\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.01428589829293817,\n\
\ \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094087\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6690898227444733,\n\
\ \"acc_stderr\": 0.004695791340502876,\n \"acc_norm\": 0.8586934873531169,\n\
\ \"acc_norm_stderr\": 0.003476255509644533\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894442,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894442\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768783,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768783\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163255,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163255\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250458,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250458\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8438818565400844,\n \"acc_stderr\": 0.02362715946031867,\n \
\ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.02362715946031867\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.02991858670779883,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.02991858670779883\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.039800662464677665,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.039800662464677665\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371798,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371798\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n\
\ \"acc_stderr\": 0.01633726869427011,\n \"acc_norm\": 0.39329608938547483,\n\
\ \"acc_norm_stderr\": 0.01633726869427011\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718968,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718968\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n\
\ \"acc_stderr\": 0.01275911706651802,\n \"acc_norm\": 0.4791395045632334,\n\
\ \"acc_norm_stderr\": 0.01275911706651802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.527991738544026,\n\
\ \"mc2_stderr\": 0.015530613367021443\n }\n}\n```"
repo_url: https://huggingface.co/willyninja30/ARIA-70B-French
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|arc:challenge|25_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hellaswag|10_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T07-22-49.937285.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-22-49.937285.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T07-22-49.937285.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T07-22-49.937285.parquet'
- config_name: results
data_files:
- split: 2023_09_22T07_22_49.937285
path:
- results_2023-09-22T07-22-49.937285.parquet
- split: latest
path:
- results_2023-09-22T07-22-49.937285.parquet
---
# Dataset Card for Evaluation run of willyninja30/ARIA-70B-French
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/willyninja30/ARIA-70B-French
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [willyninja30/ARIA-70B-French](https://huggingface.co/willyninja30/ARIA-70B-French) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_willyninja30__ARIA-70B-French",
"harness_truthfulqa_mc_0",
split="train")
```
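Each configuration also exposes a "latest" split that always points to the most recent run, plus one split per timestamped run (here, `2023_09_22T07_22_49.937285`). As a minimal sketch, assuming only the config and split names listed in this card, you can pin either one explicitly:
```python
from datasets import load_dataset

# "latest" tracks the most recent evaluation run for this config;
# replace it with the timestamped split name to pin the single run above.
latest = load_dataset(
    "open-llm-leaderboard/details_willyninja30__ARIA-70B-French",
    "harness_truthfulqa_mc_0",
    split="latest",
)
print(latest)
```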
## Latest results
These are the [latest results from run 2023-09-22T07:22:49.937285](https://huggingface.co/datasets/open-llm-leaderboard/details_willyninja30__ARIA-70B-French/blob/main/results_2023-09-22T07-22-49.937285.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6386983068475005,
"acc_stderr": 0.032863621226889406,
"acc_norm": 0.6425916297913504,
"acc_norm_stderr": 0.03283781788418258,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.527991738544026,
"mc2_stderr": 0.015530613367021443
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.01428589829293817,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094087
},
"harness|hellaswag|10": {
"acc": 0.6690898227444733,
"acc_stderr": 0.004695791340502876,
"acc_norm": 0.8586934873531169,
"acc_norm_stderr": 0.003476255509644533
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894442,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894442
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768783,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768783
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163255,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.02362715946031867,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.02362715946031867
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.02991858670779883,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.02991858670779883
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.039800662464677665,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.039800662464677665
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371798,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371798
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.01633726869427011,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.01633726869427011
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718968,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718968
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.01275911706651802,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.01275911706651802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.527991738544026,
"mc2_stderr": 0.015530613367021443
}
}
```
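The same aggregated numbers are stored at the repo root as `results_2023-09-22T07-22-49.937285.parquet` and the JSON file linked above. As a hedged sketch (the key layout is taken from the dump above; the on-disk JSON may nest it under a `"results"` key), you could recompute the MMLU average yourself:
```python
import json

from huggingface_hub import hf_hub_download

# Download the per-run results JSON from this dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_willyninja30__ARIA-70B-French",
    filename="results_2023-09-22T07-22-49.937285.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)
scores = data.get("results", data)  # tolerate either layout
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU tasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```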
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_KnutJaegersberg__deacon-13b | 2023-09-22T07:25:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/deacon-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/deacon-13b](https://huggingface.co/KnutJaegersberg/deacon-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__deacon-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T07:24:15.341487](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-13b/blob/main/results_2023-09-22T07-24-15.341487.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5535148506815978,\n\
\ \"acc_stderr\": 0.03431923347411148,\n \"acc_norm\": 0.5576006232764058,\n\
\ \"acc_norm_stderr\": 0.03429909008639212,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.015866346401384315,\n \"mc2\": 0.3932644988188049,\n\
\ \"mc2_stderr\": 0.014623370899294079\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5443686006825939,\n \"acc_stderr\": 0.014553749939306864,\n\
\ \"acc_norm\": 0.5784982935153583,\n \"acc_norm_stderr\": 0.01443019706932602\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.619398526190002,\n\
\ \"acc_stderr\": 0.004845424524764037,\n \"acc_norm\": 0.8263294164509062,\n\
\ \"acc_norm_stderr\": 0.0037805175193024927\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899208,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724342,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724342\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.025348006031534785,\n\
\ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.025348006031534785\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895992,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895992\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.728440366972477,\n \"acc_stderr\": 0.019069098363191442,\n \"\
acc_norm\": 0.728440366972477,\n \"acc_norm_stderr\": 0.019069098363191442\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483713,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483713\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7484035759897829,\n\
\ \"acc_stderr\": 0.015517322365529641,\n \"acc_norm\": 0.7484035759897829,\n\
\ \"acc_norm_stderr\": 0.015517322365529641\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.026329813341946243,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.026329813341946243\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317003,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317003\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363947,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363947\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940975,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940975\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n\
\ \"acc_stderr\": 0.01259950560833647,\n \"acc_norm\": 0.41851368970013036,\n\
\ \"acc_norm_stderr\": 0.01259950560833647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555026,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5490196078431373,\n \"acc_stderr\": 0.020130388312904528,\n \
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.020130388312904528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030802,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030802\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826368,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826368\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.015866346401384315,\n \"mc2\": 0.3932644988188049,\n\
\ \"mc2_stderr\": 0.014623370899294079\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/deacon-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|arc:challenge|25_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hellaswag|10_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T07-24-15.341487.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T07-24-15.341487.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T07-24-15.341487.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T07-24-15.341487.parquet'
- config_name: results
data_files:
- split: 2023_09_22T07_24_15.341487
path:
- results_2023-09-22T07-24-15.341487.parquet
- split: latest
path:
- results_2023-09-22T07-24-15.341487.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/deacon-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/deacon-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/deacon-13b](https://huggingface.co/KnutJaegersberg/deacon-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__deacon-13b",
"harness_truthfulqa_mc_0",
split="train")
```
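Similarly, a minimal sketch (derived from this card's metadata, which declares a "results" configuration with a "latest" split) for loading the aggregated results:
```python
from datasets import load_dataset
# The "latest" split points to the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__deacon-13b",
	"results",
	split="latest")
```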
## Latest results
These are the [latest results from run 2023-09-22T07:24:15.341487](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-13b/blob/main/results_2023-09-22T07-24-15.341487.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5535148506815978,
"acc_stderr": 0.03431923347411148,
"acc_norm": 0.5576006232764058,
"acc_norm_stderr": 0.03429909008639212,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.015866346401384315,
"mc2": 0.3932644988188049,
"mc2_stderr": 0.014623370899294079
},
"harness|arc:challenge|25": {
"acc": 0.5443686006825939,
"acc_stderr": 0.014553749939306864,
"acc_norm": 0.5784982935153583,
"acc_norm_stderr": 0.01443019706932602
},
"harness|hellaswag|10": {
"acc": 0.619398526190002,
"acc_stderr": 0.004845424524764037,
"acc_norm": 0.8263294164509062,
"acc_norm_stderr": 0.0037805175193024927
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364396,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364396
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899208,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724342,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724342
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.025348006031534785,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.025348006031534785
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895992,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895992
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5504201680672269,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.5504201680672269,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.019069098363191442,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.019069098363191442
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483713,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483713
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7484035759897829,
"acc_stderr": 0.015517322365529641,
"acc_norm": 0.7484035759897829,
"acc_norm_stderr": 0.015517322365529641
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.026329813341946243,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.026329813341946243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317003,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317003
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363947,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363947
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011624,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011624
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940975,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940975
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.01259950560833647,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.01259950560833647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555026,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.020130388312904528,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.020130388312904528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030802,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030802
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826368,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826368
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.015866346401384315,
"mc2": 0.3932644988188049,
"mc2_stderr": 0.014623370899294079
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mboth/waermeVerteilen-50-undersampled | 2023-09-22T07:26:51.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Druckhaltestation
'1': HeizkreisAllgemein
'2': Heizkurve
'3': Kaeltemengenzaehler
'4': Pumpe
'5': Raum
'6': Regler
'7': Ruecklauf
'8': Uebertrager
'9': Ventil
'10': Vorlauf
'11': Waermemengenzaehler
'12': Warmwasserbereitung
splits:
- name: train
num_bytes: 114908.01213960546
num_examples: 540
- name: test
num_bytes: 423002
num_examples: 1978
- name: valid
num_bytes: 423002
num_examples: 1978
download_size: 319448
dataset_size: 960912.0121396055
---
# Dataset Card for "waermeVerteilen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/waermeVerteilen-100-undersampled | 2023-09-22T07:26:58.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Druckhaltestation
'1': HeizkreisAllgemein
'2': Heizkurve
'3': Kaeltemengenzaehler
'4': Pumpe
'5': Raum
'6': Regler
'7': Ruecklauf
'8': Uebertrager
'9': Ventil
'10': Vorlauf
'11': Waermemengenzaehler
'12': Warmwasserbereitung
splits:
- name: train
num_bytes: 216197.29691451695
num_examples: 1016
- name: test
num_bytes: 423002
num_examples: 1978
- name: valid
num_bytes: 423002
num_examples: 1978
download_size: 353233
dataset_size: 1062201.296914517
---
# Dataset Card for "waermeVerteilen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/waermeVerteilen-200-undersampled | 2023-09-22T07:27:04.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Druckhaltestation
'1': HeizkreisAllgemein
'2': Heizkurve
'3': Kaeltemengenzaehler
'4': Pumpe
'5': Raum
'6': Regler
'7': Ruecklauf
'8': Uebertrager
'9': Ventil
'10': Vorlauf
'11': Waermemengenzaehler
'12': Warmwasserbereitung
splits:
- name: train
num_bytes: 407710.65048052603
num_examples: 1916
- name: test
num_bytes: 423002
num_examples: 1978
- name: valid
num_bytes: 423002
num_examples: 1978
download_size: 411048
dataset_size: 1253714.650480526
---
# Dataset Card for "waermeVerteilen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Shitao/pair_data | 2023-09-22T07:39:17.000Z | [
"license:mit",
"region:us"
] | Shitao | null | null | null | 0 | 0 | ---
license: mit
---
|
egoslovos1/demo | 2023-09-22T07:44:53.000Z | [
"region:us"
] | egoslovos1 | null | null | null | 0 | 0 | Entry not found |
sdg416826/test | 2023-09-25T07:22:21.000Z | [
"region:us"
] | sdg416826 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1655208
num_examples: 1000
download_size: 966969
dataset_size: 1655208
---
|
uhhlt/amharichatespeechranlp | 2023-09-22T19:12:49.000Z | [
"task_categories:text-classification",
"language:amh",
"am",
"region:us"
] | uhhlt | null | null | null | 0 | 0 | ---
language:
- amh
pretty_name: "Amharic Hate Speech Dataset"
tags:
- am
task_categories:
- text-classification
---
[GitHub](https://github.com/uhh-lt/AmharicHateSpeech)
# Introduction
The Amharic Hate Speech data was collected using the Twitter API, spanning October 1, 2020 to November 30, 2022, considering the socio-political dynamics of Ethiopia in the Twitter space. We used the [WebAnno](http://ltdemos.informatik.uni-hamburg.de/codebookanno-cba/) tool for data annotation; each tweet was annotated by two native speakers and curated by one more experienced adjudicator to determine the gold labels. A total of 15.1k tweets are presented, covering three class labels: Hate, Offensive, and Normal. Read our papers for more details about the dataset (see below).
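As an illustrative sketch (not part of the original card), the data could be loaded and the three class labels tallied; the `train` split and `label` column names are assumptions here:
```python
from collections import Counter
from datasets import load_dataset

# Assumptions: a default "train" split and a "label" column holding Hate/Offensive/Normal
ds = load_dataset("uhhlt/amharichatespeechranlp", split="train")
print(Counter(ds["label"]))
```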
# Amharic Hate Speech Data Annotation: Lab-Controlled Annotation
The dataset is annotated by two annotators and a curator to determine the gold labels.
For more details, you can read our paper entitled:
1. [Exploring Amharic Hate Speech data Collection and Classification Approaches](https://www.inf.uni-hamburg.de/en/inst/ab/lt/publications/2023-ayele-et-al-hate-ranlp.pdf)
|
mboth/luftVerteilen-50-undersampled | 2023-09-22T08:01:50.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: ZweiteGrundfunktion
dtype: string
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': Auslass
'1': Raum
'2': VolumenstromreglerAbluft
'3': VolumenstromreglerRaum
'4': VolumenstromreglerZuluft
- name: Score
dtype: float64
splits:
- name: train
num_bytes: 60732.34410511364
num_examples: 237
- name: test
num_bytes: 91259
num_examples: 352
- name: valid
num_bytes: 91259
num_examples: 352
download_size: 99040
dataset_size: 243250.34410511365
---
# Dataset Card for "luftVerteilen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/luftVerteilen-100-undersampled | 2023-09-22T08:01:55.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: ZweiteGrundfunktion
dtype: string
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': Auslass
'1': Raum
'2': VolumenstromreglerAbluft
'3': VolumenstromreglerRaum
'4': VolumenstromreglerZuluft
- name: Score
dtype: float64
splits:
- name: train
num_bytes: 103270.61044034091
num_examples: 403
- name: test
num_bytes: 91259
num_examples: 352
- name: valid
num_bytes: 91259
num_examples: 352
download_size: 111225
dataset_size: 285788.61044034094
---
# Dataset Card for "luftVerteilen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/luftVerteilen-200-undersampled | 2023-09-22T08:01:59.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: ZweiteGrundfunktion
dtype: string
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': Auslass
'1': Raum
'2': VolumenstromreglerAbluft
'3': VolumenstromreglerRaum
'4': VolumenstromreglerZuluft
- name: Score
dtype: float64
splits:
- name: train
num_bytes: 180146.99538352274
num_examples: 703
- name: test
num_bytes: 91259
num_examples: 352
- name: valid
num_bytes: 91259
num_examples: 352
download_size: 132465
dataset_size: 362664.9953835227
---
# Dataset Card for "luftVerteilen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/luftBereitstellen-50-undersampled | 2023-09-22T08:11:53.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': AbluftAllgemein
'1': Abluftfilter
'2': Abluftklappe
'3': Abluftventilator
'4': Außenluftfilter
'5': Außenluftklappe
'6': Befeuchter
'7': Erhitzer
'8': Filter
'9': Fortluftklappe
'10': GerätAllgemein
'11': Kaeltemengenzaehler
'12': KlappenAllgemein
'13': Kühler
'14': Regler
'15': Umluft
'16': Ventilator
'17': Wärmemengenzähler
'18': Wärmerückgewinnung
'19': ZuluftAllgemein
'20': Zuluftfilter
'21': Zuluftklappe
'22': Zuluftventilator
splits:
- name: train
num_bytes: 208830.91202313424
num_examples: 982
- name: test
num_bytes: 238179
num_examples: 1124
- name: valid
num_bytes: 238179
num_examples: 1124
download_size: 227690
dataset_size: 685188.9120231343
---
# Dataset Card for "luftBereitstellen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/luftBereitstellen-100-undersampled | 2023-09-22T08:11:59.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': AbluftAllgemein
'1': Abluftfilter
'2': Abluftklappe
'3': Abluftventilator
'4': Außenluftfilter
'5': Außenluftklappe
'6': Befeuchter
'7': Erhitzer
'8': Filter
'9': Fortluftklappe
'10': GerätAllgemein
'11': Kaeltemengenzaehler
'12': KlappenAllgemein
'13': Kühler
'14': Regler
'15': Umluft
'16': Ventilator
'17': Wärmemengenzähler
'18': Wärmerückgewinnung
'19': ZuluftAllgemein
'20': Zuluftfilter
'21': Zuluftklappe
'22': Zuluftventilator
splits:
- name: train
num_bytes: 378107.292848404
num_examples: 1778
- name: test
num_bytes: 238179
num_examples: 1124
- name: valid
num_bytes: 238179
num_examples: 1124
download_size: 280245
dataset_size: 854465.292848404
---
# Dataset Card for "luftBereitstellen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/luftBereitstellen-200-undersampled | 2023-09-22T08:12:04.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': AbluftAllgemein
'1': Abluftfilter
'2': Abluftklappe
'3': Abluftventilator
'4': Außenluftfilter
'5': Außenluftklappe
'6': Befeuchter
'7': Erhitzer
'8': Filter
'9': Fortluftklappe
'10': GerätAllgemein
'11': Kaeltemengenzaehler
'12': KlappenAllgemein
'13': Kühler
'14': Regler
'15': Umluft
'16': Ventilator
'17': Wärmemengenzähler
'18': Wärmerückgewinnung
'19': ZuluftAllgemein
'20': Zuluftfilter
'21': Zuluftklappe
'22': Zuluftventilator
splits:
- name: train
num_bytes: 594806.5793571349
num_examples: 2797
- name: test
num_bytes: 238179
num_examples: 1124
- name: valid
num_bytes: 238179
num_examples: 1124
download_size: 347666
dataset_size: 1071164.5793571349
---
# Dataset Card for "luftBereitstellen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ISCA-IUB/AntisemitismOnTwitter | 2023-09-22T08:39:09.000Z | [
"language:en",
"arxiv:2304.14599",
"region:us"
] | ISCA-IUB | null | null | null | 1 | 0 | ---
language:
- en
---
# Dataset Card for Dataset on Antisemitism on Twitter/X
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The ISCA project has compiled this dataset using an annotation portal, which was used to label tweets as either antisemitic or non-antisemitic, among other labels. Please note that the annotation was done with live data, including images and the context, such as threads. The original data was sourced from annotationportal.com.
### Languages
English
## Dataset Structure
‘TweetID’: Represents the tweet ID.
‘Username’: Represents the username who published the tweet.
‘Text’: Represents the full text of the tweet (not pre-processed).
‘CreateDate’: Represents the date the tweet was created.
‘Biased’: Represents the label assigned by our annotators indicating whether the tweet is antisemitic or non-antisemitic.
‘Keyword’: Represents the keyword that was used in the query. The keyword can be in the text, including mentioned names, or the username.
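For illustration, a minimal sketch of inspecting these fields, assuming the repository loads directly with `datasets` and exposes a default `train` split (an assumption, not stated in this card):
```python
from collections import Counter
from datasets import load_dataset

# Hypothetical usage sketch; the split name is an assumption
ds = load_dataset("ISCA-IUB/AntisemitismOnTwitter", split="train")

# 'Biased' holds the annotators' antisemitic / non-antisemitic label
print(Counter(ds["Biased"]))
print(ds[0]["Text"], ds[0]["Keyword"])
```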
## Dataset Creation
This dataset contains 6,941 tweets that cover a wide range of topics common in conversations about Jews, Israel, and antisemitism between January 2019 and December 2021. The dataset is drawn from representative samples during this period with relevant keywords. 1,250 tweets (18%) meet the IHRA definition of antisemitic messages.
The dataset has been compiled within the ISCA project using an annotation portal to label tweets as either antisemitic or non-antisemitic. The original data was sourced from annotationportal.com.
### Annotations
#### Annotation process
We annotated the tweets, considering the text, images, videos, and links, in their “natural” context, including threads. We used a detailed annotation guideline, based on the IHRA Definition, which has been endorsed and recommended by more than 30 governments and international organizations and is frequently used to monitor and record antisemitic incidents. We divided the definition into 12 paragraphs. Each of the paragraphs addresses a different form or trope of antisemitism. We created an online annotation tool (https://annotationportal.com) to make labeling easier, more consistent, and less prone to errors, including in the process of recording the annotations. The portal displays the tweet and a clickable annotation form (see Figure 1 in the paper). It automatically saves each annotation, including the time spent labeling each tweet.
The Annotation Portal retrieves live tweets by referencing their ID number. Our annotators first look at the tweet, and if they are unsure of the meaning, they are prompted to look at the entire thread, replies, likes, links, and comments. A click on the visualized tweet opens a new tab in the browser, displaying the message on the Twitter page in its “natural” environment.
The portal is designed to help annotators consistently label messages as antisemitic or not according to the IHRA definition. After verifying that the message is still live and in English, they select from a drop-down menu where they classify the message as "confident antisemitic," "probably antisemitic," "probably not antisemitic," "confident not antisemitic," or "don’t know." The annotation guideline, including the definition, is linked in a PDF document.
#### Who are the annotators?
All annotators are familiar with the definition and have been trained on test samples. They have also taken at least one academic course on antisemitism or have done research on antisemitism. We consider them to be expert annotators. Eight such expert annotators of different religions and genders labeled the 18 samples, two for each sample in alternating configurations.
## Considerations for Using the Data
### Social Impact of Dataset
One of the major challenges in automatic hate speech detection is the lack of datasets that cover a wide range of biased and unbiased messages and that are consistently labeled. We propose a labeling procedure that addresses some of the common weaknesses of labeled datasets.
We focus on antisemitic speech on Twitter and create a labeled dataset of 6,941 tweets that cover a wide range of topics common in conversations about Jews, Israel, and antisemitism between January 2019 and December 2021 by drawing from representative samples with relevant keywords.
Our annotation process aims to strictly apply a commonly used definition of antisemitism by forcing annotators to specify which part of the definition applies, and by giving them the option to personally disagree with the definition on a case-by-case basis. Labeling tweets that call out antisemitism, report antisemitism, or are otherwise related to antisemitism (such as the Holocaust) but are not actually antisemitic can help reduce false positives in automated detection.
## Additional Information
### Dataset Curators
Gunther Jikeli, Sameer Karali, Daniel Miehling, and Katharina Soemer
### Citation Information
Jikeli, Gunther, Sameer Karali, Daniel Miehling, and Katharina Soemer (2023): Antisemitic Messages? A Guide to High-Quality Annotation and a Labeled Dataset of Tweets. https://arxiv.org/abs/2304.14599
|
patched-codes/static-analysis-eval | 2023-10-02T09:09:06.000Z | [
"region:us"
] | patched-codes | null | null | null | 1 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: source
dtype: string
- name: file_name
dtype: string
- name: cwe
dtype: string
splits:
- name: train
num_bytes: 87854
num_examples: 76
download_size: 53832
dataset_size: 87854
---
# Dataset Card for "static-analysis-eval"
A dataset of 76 Python programs taken from real Python open source projects (top 1000 on GitHub),
where each program is a file that has exactly 1 vulnerability as detected by a particular static analyzer (Semgrep).
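A minimal usage sketch, assuming Semgrep is installed locally (the `--config auto` ruleset is illustrative and not necessarily the configuration used to build the dataset):
```python
import subprocess
from datasets import load_dataset

# Each row holds a full Python file ("source"), its file name, and the CWE of its vulnerability
ds = load_dataset("patched-codes/static-analysis-eval", split="train")

example = ds[0]
with open(example["file_name"], "w") as f:  # assumes "file_name" is a plain file name
    f.write(example["source"])

# Re-scan the file with Semgrep; "--config auto" fetches community rules (needs network access)
subprocess.run(["semgrep", "--config", "auto", example["file_name"]])
```
 |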
hmxiong/ScanRefer_Finetune | 2023-09-25T12:52:18.000Z | [
"region:us"
] | hmxiong | null | null | null | 0 | 0 | This dataset is used for finetuning the model on ScanRefer.
# V0
A total of 22,735 references were collected, and the corresponding boxes were located.
# V1
Building on V0, the boxes are normalized.
The data used in the experiments is the V1 version.
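As a purely illustrative sketch (the exact method is not specified above), one plausible normalization is min-max scaling of box coordinates against the scene's axis-aligned bounds:
```python
import numpy as np

def normalize_box(box, scene_min, scene_max):
    """Illustrative min-max normalization of a 3D box (x1, y1, z1, x2, y2, z2) to [0, 1]."""
    box = np.asarray(box, dtype=np.float32)
    scene_min = np.asarray(scene_min, dtype=np.float32)  # (x, y, z) lower bound of the scene
    extent = np.asarray(scene_max, dtype=np.float32) - scene_min
    return (box - np.tile(scene_min, 2)) / np.tile(extent, 2)
```
 |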
FremyCompany/OS-STS-nl-Dataset | 2023-09-22T08:36:12.000Z | [
"task_categories:sentence-similarity",
"size_categories:1M<n<10M",
"language:nl",
"license:other",
"region:us"
] | FremyCompany | null | null | null | 0 | 0 | ---
license: other
task_categories:
- sentence-similarity
language:
- nl
pretty_name: OpenSubtitles STS Dataset for Dutch
size_categories:
- 1M<n<10M
---
# OpenSubtitles STS Dataset for Dutch
OS-STS.nl is an extensive Dutch STS dataset containing over two million sentence pairs and similarity scores.
The dataset is automatically extracted from movie and documentary subtitles sourced from OpenSubtitles2018, a vast parallel corpus of aligned video subtitles.
Recognizing the high prevalence (>10%) of paraphrased statements and question-and-answer pairs in subtitled spoken language, we systematically extract the consecutive parallel sentence pairs from the subtitles that exhibit significant semantic overlap.
## Content of the dataset
The dataset contains Dutch sentence pairs, as well as semantic similarity scores computed from their English translations using sentence-transformers/all-mpnet-base-v2.
<div style="max-width: 480px">

</div>
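As a small illustrative sketch, a pair's score can be reproduced with the scoring model named above (the cosine-similarity step is an assumption; the exact pipeline may differ):
```python
from sentence_transformers import SentenceTransformer, util

# Score an (English-translated) sentence pair with the model used for the dataset's scores
model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
embeddings = model.encode(["Where were you last night?", "I was at home."])
print(util.cos_sim(embeddings[0], embeddings[1]))
```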
**Coming soon** |
Mihaj/ruoh_demo | 2023-09-22T10:08:30.000Z | [
"region:us"
] | Mihaj | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: mother_tongue
dtype: string
- name: region
dtype: string
- name: gender
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 1600232223.61
num_examples: 13198
- name: test
num_bytes: 405584868.6
num_examples: 3300
download_size: 1960524339
dataset_size: 2005817092.21
---
# Dataset Card for "ruoh_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
seanghay/km_large_text | 2023-09-22T08:47:27.000Z | [
"region:us"
] | seanghay | null | null | null | 0 | 0 | Entry not found |
liwenlin123/9.8_idf_jsonl_3000.jsonl | 2023-09-22T08:57:53.000Z | [
"region:us"
] | liwenlin123 | null | null | null | 0 | 0 | Entry not found |
BangumiBase/tenpuru | 2023-09-29T11:12:45.000Z | [
"size_categories:n<1K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- n<1K
---
# Bangumi Image Base of Tenpuru
This is the image base of the bangumi Tenpuru; we detected 9 characters and 883 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 272 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 50 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 221 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 36 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 37 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 101 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 115 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 22 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 29 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
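The archives above can also be fetched programmatically; a sketch using `huggingface_hub` (per-character archives follow the `<index>/dataset.zip` pattern in the table):
```python
from huggingface_hub import hf_hub_download

# Download the full image base; use e.g. filename="0/dataset.zip" for a single character
path = hf_hub_download(
    repo_id="BangumiBase/tenpuru",
    filename="all.zip",
    repo_type="dataset",
)
print(path)
```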
|
rahulmnavneeth/srdygsd | 2023-09-22T09:09:37.000Z | [
"region:us"
] | rahulmnavneeth | null | null | null | 0 | 0 | Entry not found |
FASOXO/SDCUI | 2023-10-03T01:38:00.000Z | [
"license:openrail",
"region:us"
] | FASOXO | null | null | null | 0 | 0 | ---
license: openrail
---
|
tmfi/mc4-ja | 2023-09-23T08:26:16.000Z | [
"region:us"
] | tmfi | null | null | null | 0 | 0 | Entry not found |
TokenBender/Bengali_chat_dataset | 2023-09-22T09:44:54.000Z | [
"license:apache-2.0",
"region:us"
] | TokenBender | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Coroseven/NinoNakano | 2023-09-22T09:40:02.000Z | [
"region:us"
] | Coroseven | null | null | null | 0 | 0 | Entry not found |
distil-whisper/whisper_transcriptions_greedy_timestamped | 2023-09-22T10:01:11.000Z | [
"region:us"
] | distil-whisper | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-peft | 2023-09-22T09:58:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of totally-not-an-llm/EverythingLM-13b-V3-peft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [totally-not-an-llm/EverythingLM-13b-V3-peft](https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-peft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-peft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T09:57:21.290037](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-peft/blob/main/results_2023-09-22T09-57-21.290037.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5480514597774847,\n\
\ \"acc_stderr\": 0.03489804084602524,\n \"acc_norm\": 0.5520933488863237,\n\
\ \"acc_norm_stderr\": 0.03487943342684165,\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.5297760407368329,\n\
\ \"mc2_stderr\": 0.016012808562402926\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5494880546075085,\n \"acc_stderr\": 0.014539646098471627,\n\
\ \"acc_norm\": 0.5836177474402731,\n \"acc_norm_stderr\": 0.014405618279436176\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6059549890460068,\n\
\ \"acc_stderr\": 0.0048764594346198,\n \"acc_norm\": 0.8102967536347341,\n\
\ \"acc_norm_stderr\": 0.003912649521823142\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.03028500925900979,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.03028500925900979\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873502,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873502\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n\
\ \"acc_stderr\": 0.027666182075539645,\n \"acc_norm\": 0.6161290322580645,\n\
\ \"acc_norm_stderr\": 0.027666182075539645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.03452453903822039,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.03452453903822039\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.031618779179354115,\n\
\ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.031618779179354115\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.025334667080954932,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.025334667080954932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7137614678899082,\n \"acc_stderr\": 0.019379436628919982,\n \"\
acc_norm\": 0.7137614678899082,\n \"acc_norm_stderr\": 0.019379436628919982\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.03132179803083292,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.03132179803083292\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.02917868230484255,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.02917868230484255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046735,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046735\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7330779054916986,\n\
\ \"acc_stderr\": 0.01581845089477754,\n \"acc_norm\": 0.7330779054916986,\n\
\ \"acc_norm_stderr\": 0.01581845089477754\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357629,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357629\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387292,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387292\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581982,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581982\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027125115513166854,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027125115513166854\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.01244115532685493,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.01244115532685493\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969758,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969758\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.016750862381375898,\n \"mc2\": 0.5297760407368329,\n\
\ \"mc2_stderr\": 0.016012808562402926\n }\n}\n```"
repo_url: https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-peft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|arc:challenge|25_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hellaswag|10_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T09-57-21.290037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T09-57-21.290037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T09-57-21.290037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T09-57-21.290037.parquet'
- config_name: results
data_files:
- split: 2023_09_22T09_57_21.290037
path:
- results_2023-09-22T09-57-21.290037.parquet
- split: latest
path:
- results_2023-09-22T09-57-21.290037.parquet
---
# Dataset Card for Evaluation run of totally-not-an-llm/EverythingLM-13b-V3-peft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-peft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [totally-not-an-llm/EverythingLM-13b-V3-peft](https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-peft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-peft",
"harness_truthfulqa_mc_0",
split="train")
```
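The same call can target any configuration listed in this card's frontmatter. As a minimal sketch, the aggregated metrics can be loaded through the `results` configuration and the `latest` split (both defined in the configs above):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of a run; the "latest" split
# always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-peft",
    "results",
    split="latest",
)
```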
## Latest results
These are the [latest results from run 2023-09-22T09:57:21.290037](https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-peft/blob/main/results_2023-09-22T09-57-21.290037.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5480514597774847,
"acc_stderr": 0.03489804084602524,
"acc_norm": 0.5520933488863237,
"acc_norm_stderr": 0.03487943342684165,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.016750862381375898,
"mc2": 0.5297760407368329,
"mc2_stderr": 0.016012808562402926
},
"harness|arc:challenge|25": {
"acc": 0.5494880546075085,
"acc_stderr": 0.014539646098471627,
"acc_norm": 0.5836177474402731,
"acc_norm_stderr": 0.014405618279436176
},
"harness|hellaswag|10": {
"acc": 0.6059549890460068,
"acc_stderr": 0.0048764594346198,
"acc_norm": 0.8102967536347341,
"acc_norm_stderr": 0.003912649521823142
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.03028500925900979,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.03028500925900979
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873502,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873502
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.027666182075539645,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.027666182075539645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.03452453903822039,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.03452453903822039
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7409326424870466,
"acc_stderr": 0.031618779179354115,
"acc_norm": 0.7409326424870466,
"acc_norm_stderr": 0.031618779179354115
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.025334667080954932,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.025334667080954932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7137614678899082,
"acc_stderr": 0.019379436628919982,
"acc_norm": 0.7137614678899082,
"acc_norm_stderr": 0.019379436628919982
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.03132179803083292,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.03132179803083292
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.02917868230484255,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.02917868230484255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046735,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046735
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7330779054916986,
"acc_stderr": 0.01581845089477754,
"acc_norm": 0.7330779054916986,
"acc_norm_stderr": 0.01581845089477754
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357629,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357629
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387292,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387292
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581982,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027125115513166854,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027125115513166854
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.01244115532685493,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.01244115532685493
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969758,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969758
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.016750862381375898,
"mc2": 0.5297760407368329,
"mc2_stderr": 0.016012808562402926
}
}
```
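The per-task entries above share the same schema, so simple aggregations are one-liners. As a minimal sketch (assuming the dict shown above is bound to a Python variable `results`), one way to recover an aggregate MMLU score is to average over the `hendrycksTest` subtasks:
```python
# Mean acc_norm over the MMLU ("hendrycksTest") subtasks of the dict above;
# the "all", arc, hellaswag, and truthfulqa keys don't match the prefix.
mmlu = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest")
]
print(sum(mmlu) / len(mmlu))
```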
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nuwandaa/teddy-bear | 2023-09-23T23:52:00.000Z | [
"region:us"
] | nuwandaa | null | null | null | 0 | 0 | Entry not found |
mboth/kaelteVersorgen-50-undersampled | 2023-09-22T10:14:36.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': KaelteErzeugen
'1': KaelteSpeichern
'2': KaelteVerteilen
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: Komponente
dtype: string
- name: ScoreKomponente
dtype: float64
- name: Datenpunkt
dtype: string
- name: ScoreDatenpunkt
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 27642.555450236967
num_examples: 112
- name: test
num_bytes: 32271
num_examples: 132
- name: valid
num_bytes: 32271
num_examples: 132
download_size: 51628
dataset_size: 92184.55545023696
---
# Dataset Card for "kaelteVersorgen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/kaelteVersorgen-100-undersampled | 2023-09-22T10:14:40.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': KaelteErzeugen
'1': KaelteSpeichern
'2': KaelteVerteilen
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: Komponente
dtype: string
- name: ScoreKomponente
dtype: float64
- name: Datenpunkt
dtype: string
- name: ScoreDatenpunkt
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 52323.40853080569
num_examples: 212
- name: test
num_bytes: 32271
num_examples: 132
- name: valid
num_bytes: 32271
num_examples: 132
download_size: 57973
dataset_size: 116865.4085308057
---
# Dataset Card for "kaelteVersorgen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/kaelteVersorgen-200-undersampled | 2023-09-22T10:14:44.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': KaelteErzeugen
'1': KaelteSpeichern
'2': KaelteVerteilen
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: Komponente
dtype: string
- name: ScoreKomponente
dtype: float64
- name: Datenpunkt
dtype: string
- name: ScoreDatenpunkt
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 101685.11469194313
num_examples: 412
- name: test
num_bytes: 32271
num_examples: 132
- name: valid
num_bytes: 32271
num_examples: 132
download_size: 69781
dataset_size: 166227.11469194313
---
# Dataset Card for "kaelteVersorgen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NathanLiu2023/stable-diffusion-sdk | 2023-09-22T10:16:22.000Z | [
"license:apache-2.0",
"region:us"
] | NathanLiu2023 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
mboth/kaelteErzeugen-50-undersampled | 2023-09-22T10:28:54.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: ZweiteGrundfunktion
dtype: string
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': Kaelteanlage
'1': KaeltekreisAllgemein
'2': Kaeltemaschine
'3': Kaeltemengenzaehler
'4': Klappe
'5': Pumpe
'6': RKW
'7': Regler
'8': Ruecklauf
'9': Ventil
'10': Vorlauf
'11': Waermemengenzaehler
- name: ScoreKomponente
dtype: float64
- name: Datenpunkt
dtype: string
- name: ScoreDatenpunkt
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 72126.24090121317
num_examples: 293
- name: test
num_bytes: 18282
num_examples: 73
- name: valid
num_bytes: 18282
num_examples: 73
download_size: 54220
dataset_size: 108690.24090121317
---
# Dataset Card for "kaelteErzeugen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/kaelteErzeugen-100-undersampled | 2023-09-22T10:28:58.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: ZweiteGrundfunktion
dtype: string
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': Kaelteanlage
'1': KaeltekreisAllgemein
'2': Kaeltemaschine
'3': Kaeltemengenzaehler
'4': Klappe
'5': Pumpe
'6': RKW
'7': Regler
'8': Ruecklauf
'9': Ventil
'10': Vorlauf
'11': Waermemengenzaehler
- name: ScoreKomponente
dtype: float64
- name: Datenpunkt
dtype: string
- name: ScoreDatenpunkt
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 90342.424610052
num_examples: 367
- name: test
num_bytes: 18282
num_examples: 73
- name: valid
num_bytes: 18282
num_examples: 73
download_size: 58393
dataset_size: 126906.424610052
---
# Dataset Card for "kaelteErzeugen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/kaelteErzeugen-200-undersampled | 2023-09-22T10:29:03.000Z | [
"region:us"
] | mboth | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: ZweiteGrundfunktion
dtype: string
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': Kaelteanlage
'1': KaeltekreisAllgemein
'2': Kaeltemaschine
'3': Kaeltemengenzaehler
'4': Klappe
'5': Pumpe
'6': RKW
'7': Regler
'8': Ruecklauf
'9': Ventil
'10': Vorlauf
'11': Waermemengenzaehler
- name: ScoreKomponente
dtype: float64
- name: Datenpunkt
dtype: string
- name: ScoreDatenpunkt
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 114958.88908145581
num_examples: 467
- name: test
num_bytes: 18282
num_examples: 73
- name: valid
num_bytes: 18282
num_examples: 73
download_size: 63616
dataset_size: 151522.88908145583
---
# Dataset Card for "kaelteErzeugen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mihaj/ruohqa_demo | 2023-09-22T10:34:47.000Z | [
"region:us"
] | Mihaj | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: context
dtype: string
- name: id
dtype: string
- name: question
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 384292
num_examples: 968
- name: validation
num_bytes: 165616
num_examples: 416
download_size: 287881
dataset_size: 549908
---
# Dataset Card for "ruohqa_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrainingDataPro/medical-staff-people-tracking | 2023-10-09T07:55:26.000Z | [
"task_categories:image-to-image",
"task_categories:object-detection",
"language:en",
"license:cc-by-nc-nd-4.0",
"code",
"medical",
"region:us"
] | TrainingDataPro | The dataset contains a collection of frames extracted from videos captured within a
**hospital environment**. The **bounding boxes** are drawn around the **doctors, nurses,
and other people** who appear in the video footage.
The dataset can be used for **computer vision in healthcare settings** and *the
development of systems that monitor medical staff activities and patient flow, analyze
wait times, and assess the efficiency of hospital processes*. | @InProceedings{huggingface:dataset,
title = {medical-staff-people-tracking},
author = {TrainingDataPro},
year = {2023}
} | null | 1 | 0 | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- image-to-image
- object-detection
tags:
- code
- medical
dataset_info:
- config_name: video_01
features:
- name: id
dtype: int32
- name: name
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: shapes
sequence:
- name: track_id
dtype: uint32
- name: label
dtype:
class_label:
names:
'0': nurse
'1': doctor
'2': other_people
- name: type
dtype: string
- name: points
sequence:
sequence: float32
- name: rotation
dtype: float32
- name: occluded
dtype: uint8
- name: attributes
sequence:
- name: name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 27856
num_examples: 64
download_size: 23409734
dataset_size: 27856
- config_name: video_02
features:
- name: id
dtype: int32
- name: name
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: shapes
sequence:
- name: track_id
dtype: uint32
- name: label
dtype:
class_label:
names:
'0': nurse
'1': doctor
'2': other_people
- name: type
dtype: string
- name: points
sequence:
sequence: float32
- name: rotation
dtype: float32
- name: occluded
dtype: uint8
- name: attributes
sequence:
- name: name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 37214
num_examples: 73
download_size: 28155019
dataset_size: 37214
---
# Medical Staff People Tracking
The dataset contains a collection of frames extracted from videos captured within a **hospital environment**. The **bounding boxes** are drawn around the **doctors, nurses, and other people** who appear in the video footage.
The dataset can be used for **computer vision in healthcare settings** and *the development of systems that monitor medical staff activities and patient flow, analyze wait times, and assess the efficiency of hospital processes*.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=medical-staff-people-tracking) to discuss your requirements, learn about the price and buy the dataset.
# Dataset structure
The dataset consists of 2 folders with frames from videos recorded in a hospital.
Each folder includes:
- **images**: folder with original frames from the video,
- **boxes**: visualized data labeling for the images in the previous folder,
- **.csv file**: file with id and path of each frame in the "images" folder,
- **annotations.xml**: contains coordinates of the bounding boxes, created for the original frames
# Data Format
Each frame from the `images` folder is accompanied by an XML annotation in the `annotations.xml` file indicating the coordinates of the bounding boxes for people tracking. For each point, the x and y coordinates are provided.
### Classes:
- **doctor** - doctor in the frame
- **nurse** - nurse in the frame
- **others** - other people (not medical staff)
# Example of the XML file
.png?generation=1695995011699193&alt=media)
# Object tracking can be performed in accordance with your requirements.
## **[TrainingData](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=medical-staff-people-tracking)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
evi1m0/testdataset | 2023-09-22T10:39:37.000Z | [
"license:artistic-2.0",
"region:us"
] | evi1m0 | null | null | null | 0 | 0 | ---
license: artistic-2.0
---
|
open-llm-leaderboard/details_Faradaylab__ARIA-70B-V3 | 2023-09-22T10:45:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Faradaylab/ARIA-70B-V3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Faradaylab/ARIA-70B-V3](https://huggingface.co/Faradaylab/ARIA-70B-V3) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Faradaylab__ARIA-70B-V3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T10:43:51.211297](https://huggingface.co/datasets/open-llm-leaderboard/details_Faradaylab__ARIA-70B-V3/blob/main/results_2023-09-22T10-43-51.211297.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6471664219890731,\n\
\ \"acc_stderr\": 0.03252894231531827,\n \"acc_norm\": 0.651041907545905,\n\
\ \"acc_norm_stderr\": 0.0325031101698762,\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.513240508208704,\n\
\ \"mc2_stderr\": 0.015101415537603125\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809174,\n\
\ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6726747659828719,\n\
\ \"acc_stderr\": 0.004682780790508322,\n \"acc_norm\": 0.8620792670782712,\n\
\ \"acc_norm_stderr\": 0.0034411206110598396\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055284,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055284\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723285,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723285\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215282,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215282\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634342,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634342\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.44370860927152317,\n \"acc_stderr\": 0.04056527902281732,\n \"\
acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.04056527902281732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8774509803921569,\n \"acc_stderr\": 0.023015389732458258,\n \"\
acc_norm\": 0.8774509803921569,\n \"acc_norm_stderr\": 0.023015389732458258\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8438818565400844,\n \"acc_stderr\": 0.02362715946031868,\n \
\ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.02362715946031868\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7399103139013453,\n\
\ \"acc_stderr\": 0.02944249558585747,\n \"acc_norm\": 0.7399103139013453,\n\
\ \"acc_norm_stderr\": 0.02944249558585747\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990905,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990905\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n\
\ \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n\
\ \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904664,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904664\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.025218040373410616,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.025218040373410616\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48891786179921776,\n\
\ \"acc_stderr\": 0.012767098998525852,\n \"acc_norm\": 0.48891786179921776,\n\
\ \"acc_norm_stderr\": 0.012767098998525852\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928007,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928007\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6977124183006536,\n \"acc_stderr\": 0.018579232711113877,\n \
\ \"acc_norm\": 0.6977124183006536,\n \"acc_norm_stderr\": 0.018579232711113877\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142787,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142787\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900836,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900836\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34149326805385555,\n\
\ \"mc1_stderr\": 0.016600688619950826,\n \"mc2\": 0.513240508208704,\n\
\ \"mc2_stderr\": 0.015101415537603125\n }\n}\n```"
repo_url: https://huggingface.co/Faradaylab/ARIA-70B-V3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|arc:challenge|25_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hellaswag|10_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T10-43-51.211297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T10-43-51.211297.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T10-43-51.211297.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T10-43-51.211297.parquet'
- config_name: results
data_files:
- split: 2023_09_22T10_43_51.211297
path:
- results_2023-09-22T10-43-51.211297.parquet
- split: latest
path:
- results_2023-09-22T10-43-51.211297.parquet
---
# Dataset Card for Evaluation run of Faradaylab/ARIA-70B-V3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Faradaylab/ARIA-70B-V3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Faradaylab/ARIA-70B-V3](https://huggingface.co/Faradaylab/ARIA-70B-V3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Faradaylab__ARIA-70B-V3",
"harness_truthfulqa_mc_0",
split="train")
```
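The aggregated metrics are stored in the `results` configuration (see the configs listed above), whose `latest` split points at the most recent results file, so they can be loaded the same way:
```python
from datasets import load_dataset

# Load the aggregated results of the run; the "latest" split always points
# to the most recent results parquet file.
results = load_dataset("open-llm-leaderboard/details_Faradaylab__ARIA-70B-V3",
                       "results",
                       split="latest")
```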
## Latest results
These are the [latest results from run 2023-09-22T10:43:51.211297](https://huggingface.co/datasets/open-llm-leaderboard/details_Faradaylab__ARIA-70B-V3/blob/main/results_2023-09-22T10-43-51.211297.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6471664219890731,
"acc_stderr": 0.03252894231531827,
"acc_norm": 0.651041907545905,
"acc_norm_stderr": 0.0325031101698762,
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.513240508208704,
"mc2_stderr": 0.015101415537603125
},
"harness|arc:challenge|25": {
"acc": 0.5998293515358362,
"acc_stderr": 0.014317197787809174,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6726747659828719,
"acc_stderr": 0.004682780790508322,
"acc_norm": 0.8620792670782712,
"acc_norm_stderr": 0.0034411206110598396
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055284,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055284
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723285,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215282,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215282
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634342,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634342
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.44370860927152317,
"acc_stderr": 0.04056527902281732,
"acc_norm": 0.44370860927152317,
"acc_norm_stderr": 0.04056527902281732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8774509803921569,
"acc_stderr": 0.023015389732458258,
"acc_norm": 0.8774509803921569,
"acc_norm_stderr": 0.023015389732458258
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.02362715946031868,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.02362715946031868
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7399103139013453,
"acc_stderr": 0.02944249558585747,
"acc_norm": 0.7399103139013453,
"acc_norm_stderr": 0.02944249558585747
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.035208939510976534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990905,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990905
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904664,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904664
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.025218040373410616,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.025218040373410616
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48891786179921776,
"acc_stderr": 0.012767098998525852,
"acc_norm": 0.48891786179921776,
"acc_norm_stderr": 0.012767098998525852
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6977124183006536,
"acc_stderr": 0.018579232711113877,
"acc_norm": 0.6977124183006536,
"acc_norm_stderr": 0.018579232711113877
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142787,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142787
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900836,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900836
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34149326805385555,
"mc1_stderr": 0.016600688619950826,
"mc2": 0.513240508208704,
"mc2_stderr": 0.015101415537603125
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
NexaAI/Armchair | 2023-09-22T11:02:19.000Z | [
"region:us"
] | NexaAI | null | null | null | 0 | 0 | Entry not found |
MyRebRIc/mcig2 | 2023-09-22T10:59:58.000Z | [
"region:us"
] | MyRebRIc | null | null | null | 0 | 0 | Entry not found |
TalTechNLP/ERR_news_newsroom | 2023-09-22T11:15:06.000Z | [
"license:cc-by-4.0",
"region:us"
] | TalTechNLP | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
|
NexaAI/Bed | 2023-09-25T06:56:09.000Z | [
"region:us"
] | NexaAI | null | null | null | 0 | 0 | Entry not found |
Gboparoobop/1 | 2023-09-22T11:14:34.000Z | [
"task_categories:feature-extraction",
"task_categories:text-classification",
"task_categories:token-classification",
"license:creativeml-openrail-m",
"biology",
"art",
"region:us"
] | Gboparoobop | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
task_categories:
- feature-extraction
- text-classification
- token-classification
tags:
- biology
- art
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TDKMBL/aynen | 2023-09-22T11:18:56.000Z | [
"region:us"
] | TDKMBL | null | null | null | 0 | 0 | Entry not found |
qgyd2021/few_shot_intent_sft | 2023-10-10T12:11:07.000Z | [
"task_categories:text-classification",
"task_categories:text-generation",
"task_categories:text2text-generation",
"size_categories:100M<n<1B",
"language:zh",
"language:en",
"license:apache-2.0",
"few-shot",
"intent",
"region:us"
] | qgyd2021 | null | @dataset{few_shot_intent_sft,
author = {Xing Tian},
title = {few_shot_intent_sft},
month = sep,
year = 2023,
publisher = {Xing Tian},
version = {1.0},
} | null | 1 | 0 | ---
license: apache-2.0
task_categories:
- text-classification
- text-generation
- text2text-generation
language:
- zh
- en
tags:
- few-shot
- intent
size_categories:
- 100M<n<1B
---
## Few-shot Intent Recognition Instruction Dataset
For few-shot intent recognition research with LLMs.
```text
https://huggingface.co/datasets/fathyshalab/atis_intents
https://huggingface.co/datasets/generalization/conv_intent_Full-p_1
https://huggingface.co/datasets/banking77
https://huggingface.co/datasets/dipesh/Intent-Classification-large
https://huggingface.co/datasets/SetFit/amazon_massive_intent_en-US
https://huggingface.co/datasets/SetFit/amazon_massive_intent_zh-CN
https://huggingface.co/datasets/SetFit/amazon_massive_intent_zh-TW
https://huggingface.co/datasets/snips_built_in_intents
https://huggingface.co/datasets/zapsdcn/citation_intent
https://huggingface.co/datasets/ibm/vira-intents
https://huggingface.co/datasets/mteb/mtop_intent
https://huggingface.co/datasets/Bhuvaneshwari/intent_classification
https://huggingface.co/datasets/ibm/vira-intents-live
https://huggingface.co/datasets/ebrigham/nl_banking_intents
https://pan.baidu.com/s/19_oqY4bC_lJa_7Mc6lxU7w?pwd=v4bi
https://gitee.com/a2798063/SMP2019/tree/master
```
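Below is a minimal loading sketch for one of the source corpora listed above; `banking77` is taken from that list, and the split name is an assumption based on the usual Hub layout.
```python
from datasets import load_dataset

# Load one of the source intent corpora listed above (split name assumed).
banking = load_dataset("banking77", split="train")
print(banking[0])  # e.g. {"text": ..., "label": ...}
```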
|
Gboparoobop/2 | 2023-09-22T11:52:16.000Z | [
"license:openrail",
"region:us"
] | Gboparoobop | null | null | null | 0 | 0 | ---
license: openrail
---
|
demizzzzzz/arda_turan | 2023-09-22T12:13:56.000Z | [
"region:us"
] | demizzzzzz | null | null | null | 0 | 0 | Entry not found |
Amgalan/final | 2023-09-22T12:22:54.000Z | [
"region:us"
] | Amgalan | null | null | null | 0 | 0 | Entry not found |
Zerenidel/Trio | 2023-09-22T12:32:42.000Z | [
"region:us"
] | Zerenidel | null | null | null | 0 | 0 | Entry not found |
davanstrien/modeldb | 2023-09-22T12:34:40.000Z | [
"region:us"
] | davanstrien | null | null | null | 0 | 0 | Entry not found |
p1atdev/fake-news-jp | 2023-09-22T12:54:43.000Z | [
"size_categories:10K<n<100K",
"language:ja",
"license:cc-by-2.5",
"region:us"
] | p1atdev | A dataset consisting of Japanese news articles and deepfake articles generated with a Japanese GPT-2 model. | \ | null | 0 | 0 | ---
license: cc-by-2.5
language:
- ja
size_categories:
- 10K<n<100K
---
# Japanese Fake News Dataset
The [Japanese Fakenews Dataset](https://github.com/tanreinama/Japanese-Fakenews-Dataset) converted for use with HuggingFace datasets.
## Labels
- id: unique ID
- context: article body
- fake_type: `real` if genuine, `partial_gpt2` if AI-generated (GPT-2) from partway through, `full_gpt2` if entirely GPT-2
- nchar_real: number of characters in the genuine portion
- nchar_fake: number of characters in the fake portion
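A minimal usage sketch (the split name and exact loading behavior are assumptions):
```python
from datasets import load_dataset

# Load the dataset and keep only fully machine-generated articles,
# using the fake_type label described above (split name assumed).
dataset = load_dataset("p1atdev/fake-news-jp", split="train")
fake_only = dataset.filter(lambda row: row["fake_type"] == "full_gpt2")
print(len(fake_only), "fully GPT-2-generated articles")
```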
|
BiancoMat/metamat | 2023-09-22T17:10:12.000Z | [
"art",
"region:us"
] | BiancoMat | null | null | null | 0 | 0 | ---
tags:
- art
--- |
OmerhanSelman/feyzullahask | 2023-09-22T12:52:26.000Z | [
"license:openrail",
"region:us"
] | OmerhanSelman | null | null | null | 0 | 0 | ---
license: openrail
---
|
MetamatSoul/Souls | 2023-09-22T12:54:55.000Z | [
"license:unknown",
"region:us"
] | MetamatSoul | null | null | null | 0 | 0 | ---
license: unknown
---
|
thick99/oo | 2023-09-22T13:07:34.000Z | [
"license:bigcode-openrail-m",
"region:us"
] | thick99 | null | null | null | 1 | 0 | ---
license: bigcode-openrail-m
---
|
open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1 | 2023-09-22T13:09:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Xwin-LM/Xwin-LM-70B-V0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xwin-LM/Xwin-LM-70B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T13:08:23.293621](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1/blob/main/results_2023-09-22T13-08-23.293621.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6969031190908623,\n\
\ \"acc_stderr\": 0.03089637267795339,\n \"acc_norm\": 0.7007672507029784,\n\
\ \"acc_norm_stderr\": 0.030866151076173128,\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5985719496292411,\n\
\ \"mc2_stderr\": 0.015159352218131503\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.01384746051889298,\n\
\ \"acc_norm\": 0.7022184300341296,\n \"acc_norm_stderr\": 0.013363080107244487\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6872137024497113,\n\
\ \"acc_stderr\": 0.004626805906522212,\n \"acc_norm\": 0.8725353515236008,\n\
\ \"acc_norm_stderr\": 0.0033281118131353823\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.031546980450822305,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.031546980450822305\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.029771642712491227,\n\
\ \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.029771642712491227\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.022390787638216773,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.022390787638216773\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.022815813098896607,\n \
\ \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.022815813098896607\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394849,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394849\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02865749128507196,\n \
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02865749128507196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.013708749534172636,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.013708749534172636\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\"\
: 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065494,\n \"\
acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.0321782942074463,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.0321782942074463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\"\
: 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517964,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517964\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899091,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899091\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7861271676300579,\n \"acc_stderr\": 0.022075709251757177,\n\
\ \"acc_norm\": 0.7861271676300579,\n \"acc_norm_stderr\": 0.022075709251757177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5318435754189944,\n\
\ \"acc_stderr\": 0.016688553415612217,\n \"acc_norm\": 0.5318435754189944,\n\
\ \"acc_norm_stderr\": 0.016688553415612217\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.020581466138257114,\n\
\ \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.020581466138257114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5384615384615384,\n\
\ \"acc_stderr\": 0.01273239828619043,\n \"acc_norm\": 0.5384615384615384,\n\
\ \"acc_norm_stderr\": 0.01273239828619043\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625162,\n\
\ \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625162\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7598039215686274,\n \"acc_stderr\": 0.017282760695167404,\n \
\ \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.017282760695167404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546188,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546188\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.02386832565759416,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.02386832565759416\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n\
\ \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5985719496292411,\n\
\ \"mc2_stderr\": 0.015159352218131503\n }\n}\n```"
repo_url: https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|arc:challenge|25_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hellaswag|10_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T13-08-23.293621.parquet'
- config_name: results
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- results_2023-09-22T13-08-23.293621.parquet
- split: latest
path:
- results_2023-09-22T13-08-23.293621.parquet
---
# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-70B-V0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-LM-70B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1",
"harness_truthfulqa_mc_0",
split="train")
```
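To explore the other configurations, you can first list them; a small sketch using the standard `datasets` helpers (the `"latest"` split name follows the configuration listing above):
```python
from datasets import get_dataset_config_names, load_dataset

# List every per-task configuration exposed by this evaluation repo.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1"
)
print(len(configs), "configurations")

# Each run is also available as a timestamped split next to "latest".
details = load_dataset(
    "open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```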
## Latest results
These are the [latest results from run 2023-09-22T13:08:23.293621](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1/blob/main/results_2023-09-22T13-08-23.293621.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6969031190908623,
"acc_stderr": 0.03089637267795339,
"acc_norm": 0.7007672507029784,
"acc_norm_stderr": 0.030866151076173128,
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5985719496292411,
"mc2_stderr": 0.015159352218131503
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.01384746051889298,
"acc_norm": 0.7022184300341296,
"acc_norm_stderr": 0.013363080107244487
},
"harness|hellaswag|10": {
"acc": 0.6872137024497113,
"acc_stderr": 0.004626805906522212,
"acc_norm": 0.8725353515236008,
"acc_norm_stderr": 0.0033281118131353823
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.031546980450822305,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.031546980450822305
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049398,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.022390787638216773,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.022390787638216773
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.022815813098896607,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.022815813098896607
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394849,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394849
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02865749128507196,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02865749128507196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.013708749534172636,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.013708749534172636
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065494,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.0321782942074463,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.0321782942074463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.0309227883204458,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.0309227883204458
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517964,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517964
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899091,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899091
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7861271676300579,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.7861271676300579,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5318435754189944,
"acc_stderr": 0.016688553415612217,
"acc_norm": 0.5318435754189944,
"acc_norm_stderr": 0.016688553415612217
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.020581466138257114,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.020581466138257114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.01273239828619043,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.01273239828619043
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625162,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625162
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.017282760695167404,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.017282760695167404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546188,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546188
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.02386832565759416,
"acc_norm": 0.94,
"acc_norm_stderr": 0.02386832565759416
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40269277845777235,
"mc1_stderr": 0.01716883093518722,
"mc2": 0.5985719496292411,
"mc2_stderr": 0.015159352218131503
}
}
```
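The per-task blocks above share a uniform schema (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so they are easy to aggregate. A minimal sketch, assuming the full results JSON (whose tail is shown above) has been saved locally as `results.json` (a hypothetical filename), computing the macro-average accuracy over the MMLU subtasks:
```python
import json

# "results.json" is an assumed local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Keep only the MMLU (hendrycksTest) subtasks and average their accuracies.
mmlu = {name: scores["acc"] for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
print(f"{len(mmlu)} MMLU subtasks, macro-average acc = "
      f"{sum(mmlu.values()) / len(mmlu):.4f}")
```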
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hdeldar/Persian-Text-llama2-1k-2 | 2023-09-22T13:13:52.000Z | [
"region:us"
] | hdeldar | null | null | null | 0 | 0 | Entry not found |
hdeldar/Persian-Text-llama2-1k-3 | 2023-09-22T13:15:04.000Z | [
"region:us"
] | hdeldar | null | null | null | 0 | 0 | Entry not found |
hdeldar/Persian-Text-llama2-1k-4 | 2023-09-22T13:15:18.000Z | [
"region:us"
] | hdeldar | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_zarakiquemparte__zarablend-l2-7b | 2023-09-22T13:27:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/zarablend-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zarablend-l2-7b](https://huggingface.co/zarakiquemparte/zarablend-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zarablend-l2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T13:26:53.178653](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarablend-l2-7b/blob/main/results_2023-09-22T13-26-53.178653.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2753775167785235,\n\
\ \"em_stderr\": 0.00457467023556627,\n \"f1\": 0.354505033557049,\n\
\ \"f1_stderr\": 0.004527443322138582,\n \"acc\": 0.3886004022324439,\n\
\ \"acc_stderr\": 0.009038856275635394\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2753775167785235,\n \"em_stderr\": 0.00457467023556627,\n\
\ \"f1\": 0.354505033557049,\n \"f1_stderr\": 0.004527443322138582\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04397270659590599,\n \
\ \"acc_stderr\": 0.005647666449126459\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.01243004610214433\n\
\ }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zarablend-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T13_26_53.178653
path:
- '**/details_harness|drop|3_2023-09-22T13-26-53.178653.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T13-26-53.178653.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T13_26_53.178653
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-26-53.178653.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-26-53.178653.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T13_26_53.178653
path:
- '**/details_harness|winogrande|5_2023-09-22T13-26-53.178653.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T13-26-53.178653.parquet'
- config_name: results
data_files:
- split: 2023_09_22T13_26_53.178653
path:
- results_2023-09-22T13-26-53.178653.parquet
- split: latest
path:
- results_2023-09-22T13-26-53.178653.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zarablend-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zarablend-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zarablend-l2-7b](https://huggingface.co/zarakiquemparte/zarablend-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zarablend-l2-7b",
"harness_winogrande_5",
split="train")
```
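Every config above also declares a `latest` split alias, so you can pin to the most recent run without hard-coding a timestamp. A small sketch using only the configs declared in this card:
```python
from datasets import load_dataset

# "latest" is declared as a split alias for each config above, so this
# always resolves to the most recent evaluation run.
drop_details = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__zarablend-l2-7b",
    "harness_drop_3",
    split="latest",
)
print(drop_details)  # inspect the per-sample DROP details
```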
## Latest results
These are the [latest results from run 2023-09-22T13:26:53.178653](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarablend-l2-7b/blob/main/results_2023-09-22T13-26-53.178653.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2753775167785235,
"em_stderr": 0.00457467023556627,
"f1": 0.354505033557049,
"f1_stderr": 0.004527443322138582,
"acc": 0.3886004022324439,
"acc_stderr": 0.009038856275635394
},
"harness|drop|3": {
"em": 0.2753775167785235,
"em_stderr": 0.00457467023556627,
"f1": 0.354505033557049,
"f1_stderr": 0.004527443322138582
},
"harness|gsm8k|5": {
"acc": 0.04397270659590599,
"acc_stderr": 0.005647666449126459
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.01243004610214433
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dikw/gold_open_sft_data | 2023-09-22T13:42:46.000Z | [
"license:apache-2.0",
"region:us"
] | dikw | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Roscall/emmasmith-rvc | 2023-09-22T14:12:13.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
linhtran92/infer_fix_70 | 2023-09-22T14:17:28.000Z | [
"region:us"
] | linhtran92 | null | null | null | 0 | 0 | Entry not found |
abaditya26/prakriti | 2023-09-22T14:20:18.000Z | [
"license:apache-2.0",
"region:us"
] | abaditya26 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
newsmediabias/GPT_synthetic_social_media_data | 2023-10-03T22:50:21.000Z | [
"doi:10.57967/hf/1138",
"region:us"
] | newsmediabias | null | null | null | 0 | 0 | Entry not found |
newsmediabias/Social_media_cleaned_data | 2023-10-03T00:25:02.000Z | [
"region:us"
] | newsmediabias | null | null | null | 0 | 0 | Entry not found |
alexmoini/simon_sinek_dataset | 2023-09-23T16:05:00.000Z | [
"region:us"
] | alexmoini | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: chunk_name
dtype: string
- name: conversation
dtype: string
- name: speech_type
dtype: string
splits:
- name: train
num_bytes: 1899282
num_examples: 325
download_size: 851140
dataset_size: 1899282
---
# Dataset Card for "simon_sinek_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fmattera/test_data2 | 2023-09-22T14:31:13.000Z | [
"region:us"
] | fmattera | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning
dtype: image
- name: prompt
sequence: string
splits:
- name: train
num_bytes: 3854203.0
num_examples: 4
download_size: 3857683
dataset_size: 3854203.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_data2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Karthikeyan0123/en-ta | 2023-09-22T14:53:02.000Z | [
"license:openrail",
"region:us"
] | Karthikeyan0123 | null | null | null | 0 | 0 | ---
license: openrail
---
|
infCapital/vnnews-txt-corpus | 2023-09-22T16:02:49.000Z | [
"language:vi",
"license:cc",
"finance",
"chemistry",
"art",
"region:us"
] | infCapital | null | null | null | 0 | 0 | ---
license: cc
language:
- vi
tags:
- finance
- chemistry
- art
---
VNNews TXT raw corpus |
DSSGxMunich/nrw-bplan-scrape | 2023-10-09T09:17:58.000Z | [
"license:mit",
"region:us"
] | DSSGxMunich | null | null | null | 0 | 0 | ---
license: mit
---
# Dataset Card for nrw-bplan-scrape
## Dataset Description
**Homepage:** [DSSGx Munich](https://sites.google.com/view/dssgx-munich-2023/startseite) organization page.
**Repository:** [GitHub](https://github.com/DSSGxMunich/land-sealing-dataset-and-analysis).
### Dataset Summary
This dataset contains all inputs needed to run the full pipeline for creating the NRW land sealing dataset, as well as the pipeline's outputs. These can be reproduced by running [this notebook](https://github.com/DSSGxMunich/land-sealing-dataset-and-analysis/blob/main/src/1_execute_pipeline.ipynb).
## Dataset Structure
* nrw
* bplan
* features
* keywords
* exact_search
* ```baunvo_keywords.csv```: Yes/no results for keywords found in documents relating to baunvo and article 13b (loaded in the sketch after this list).
* ```hochwasser_keywords.csv```: Results of keywords found in documents relating to "hochwasser", e.g. hqhäufig and hq100
* fuzzy_search:
* ```keyword_dict_hochwasser.json```: **to do**
* Contains 7 CSV files with the results of the fuzzy keyword search. Each file name indicates the key searched for, and the text around each keyword match is extracted into one row per document.
* raw
* images: images from [here](https://huggingface.co/datasets/DSSGxMunich/nrw-bplan-images) can be added to this folder
* links:
* ```NRW_BP.geojson```: The file downloaded from the NRW Geoportal, containing all raw data on URLs to land parcel bplans.
* ```land_parcels.geojson```: A processed version of NRW_BP.geojson
* ```NRW_BP_parsed_links.csv```: A csv formatted version of NRW_BP.geojson.
* text:
* ```bp_text.json```: Raw output of the text extraction for each PDF. Contains only columns for the filename and the extracted text.
* ```document_texts.json```: Enriched version of bp_text.json in which columns about the documents have been appended.
* pdfs: PDFs extracted from the NRW Geoportal; they are found [here](https://huggingface.co/datasets/DSSGxMunich/nrw-bplan-pdfs) and can be added to this folder
* knowledge_extraction_agent: Contains 6 json files. The filename corresponds to the key looked for in the fuzzy keyword search (e.g. ```fh.json``` corresponds to ```firsthöhe.csv```, ```gfz.json``` corresponds to ```geschossflächenzahl.csv```). More info can be found [here](https://huggingface.co/datasets/DSSGxMunich/bplan_keyword_extraction)
* ```knowledge_agent_output.json```: A toy example, covering 10 files, of the knowledge-agent pipeline output (the merged results from ```nrw/bplan/knowledge_extraction_agent```)
* clean
* ```document_texts.xlsx```: See [here](https://huggingface.co/datasets/DSSGxMunich/document_text) for more information
* ```exact_keyword.xlsx```: **to clarify**: this corresponds to baunvo_keywords.csv, **not** the merged results of the exact-search tables (baunvo & hochwasser) - this is unclear; either the hochwasser keywords should be joined or the file should be renamed
* ```fuzzy_keyword.xlsx```: The merged version of the files found in ```nrw/bplan/fuzzy_search```
* ```knowledge_agent.xlsx```: The .xlsx version of ```nrw/bplan/knowledge_agent_output.json```
* ```land_parcels.xlsx```: See [here](https://huggingface.co/datasets/DSSGxMunich/land_parcels) for more information
* ```regional_plans.xlsx```: The .xlsx version of the data table found [here](https://huggingface.co/datasets/DSSGxMunich/regional_plan_sections)
* rplan
* features: contains ```regional_plan_sections.json```, the output of the pipeline - a more detailed description can be found [here](https://huggingface.co/datasets/DSSGxMunich/regional_plan_sections)
* raw
* geo: contains ```regions_map.geojson``` with information on the geolocations of the regional plans
* pdfs: contains pdfs of regional plans for NRW - used as input to run the pipeline
* text: contains text extracted with Tika from all pdf regional plans
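Since the feature tables are plain CSV and the raw geodata is GeoJSON, they can be combined with standard tooling. A minimal sketch, assuming the directory layout above; the join key (`filename`) is only a guess, so inspect the real columns first:
```python
import pandas as pd
import geopandas as gpd  # assumed to be installed, for reading GeoJSON

# Paths follow the directory layout described above.
keywords = pd.read_csv("nrw/bplan/features/keywords/exact_search/baunvo_keywords.csv")
parcels = gpd.read_file("nrw/bplan/raw/links/land_parcels.geojson")

# Inspect the actual column names before joining; the shared key below is
# only a hypothesis.
print(keywords.columns.tolist())
print(parcels.columns.tolist())

# Hypothetical join once the shared key is confirmed:
# merged = parcels.merge(keywords, on="filename", how="left")
```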
|
Atheer174/products_NER | 2023-09-22T15:32:52.000Z | [
"region:us"
] | Atheer174 | null | null | null | 0 | 0 | Entry not found |
marcosguilherme/myDataSets | 2023-10-03T18:55:47.000Z | [
"region:us"
] | marcosguilherme | null | null | null | 0 | 0 | Entry not found |
Tsuinzues/dataset-alfredo-martins | 2023-09-22T15:38:30.000Z | [
"license:openrail",
"region:us"
] | Tsuinzues | null | null | null | 0 | 0 | ---
license: openrail
---
|
MyRebRIc/mcig45 | 2023-09-22T15:47:49.000Z | [
"region:us"
] | MyRebRIc | null | null | null | 0 | 0 | Entry not found |
Matheus30cs/FakeCrash | 2023-09-22T18:03:23.000Z | [
"region:us"
] | Matheus30cs | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_FabbriSimo01__Cerebras_1.3b_Quantized | 2023-09-22T16:09:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of FabbriSimo01/Cerebras_1.3b_Quantized
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FabbriSimo01/Cerebras_1.3b_Quantized](https://huggingface.co/FabbriSimo01/Cerebras_1.3b_Quantized)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FabbriSimo01__Cerebras_1.3b_Quantized\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T16:08:53.530245](https://huggingface.co/datasets/open-llm-leaderboard/details_FabbriSimo01__Cerebras_1.3b_Quantized/blob/main/results_2023-09-22T16-08-53.530245.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.0002773614457335628,\n \"f1\": 0.03707739093959742,\n\
\ \"f1_stderr\": 0.0010591502361020477,\n \"acc\": 0.2694565433979606,\n\
\ \"acc_stderr\": 0.007855236930515893\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335628,\n\
\ \"f1\": 0.03707739093959742,\n \"f1_stderr\": 0.0010591502361020477\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401502038\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5351223362273086,\n \"acc_stderr\": 0.014017773120881582\n\
\ }\n}\n```"
repo_url: https://huggingface.co/FabbriSimo01/Cerebras_1.3b_Quantized
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T16_08_53.530245
path:
- '**/details_harness|drop|3_2023-09-22T16-08-53.530245.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T16-08-53.530245.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T16_08_53.530245
path:
- '**/details_harness|gsm8k|5_2023-09-22T16-08-53.530245.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T16-08-53.530245.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T16_08_53.530245
path:
- '**/details_harness|winogrande|5_2023-09-22T16-08-53.530245.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T16-08-53.530245.parquet'
- config_name: results
data_files:
- split: 2023_09_22T16_08_53.530245
path:
- results_2023-09-22T16-08-53.530245.parquet
- split: latest
path:
- results_2023-09-22T16-08-53.530245.parquet
---
# Dataset Card for Evaluation run of FabbriSimo01/Cerebras_1.3b_Quantized
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FabbriSimo01/Cerebras_1.3b_Quantized
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FabbriSimo01/Cerebras_1.3b_Quantized](https://huggingface.co/FabbriSimo01/Cerebras_1.3b_Quantized) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FabbriSimo01__Cerebras_1.3b_Quantized",
"harness_winogrande_5",
split="train")
```
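The aggregated metrics shown below also live in the `results` config declared above, so they can be loaded programmatically instead of copied from the card. A sketch (the exact row schema isn't documented here, so the print is only for inspection):
```python
from datasets import load_dataset

# The "results" config (declared in the YAML above) stores the aggregated
# metrics; "latest" resolves to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_FabbriSimo01__Cerebras_1.3b_Quantized",
    "results",
    split="latest",
)
print(results[0])  # row schema not documented in this card; inspect it this way
```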
## Latest results
These are the [latest results from run 2023-09-22T16:08:53.530245](https://huggingface.co/datasets/open-llm-leaderboard/details_FabbriSimo01__Cerebras_1.3b_Quantized/blob/main/results_2023-09-22T16-08-53.530245.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335628,
"f1": 0.03707739093959742,
"f1_stderr": 0.0010591502361020477,
"acc": 0.2694565433979606,
"acc_stderr": 0.007855236930515893
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335628,
"f1": 0.03707739093959742,
"f1_stderr": 0.0010591502361020477
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401502038
},
"harness|winogrande|5": {
"acc": 0.5351223362273086,
"acc_stderr": 0.014017773120881582
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
LauCOLL1/checkpoint | 2023-09-22T16:57:50.000Z | [
"region:us"
] | LauCOLL1 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF | 2023-09-22T17:04:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheTravellingEngineer/bloom-560m-RLHF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheTravellingEngineer/bloom-560m-RLHF](https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T17:04:20.598203](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF/blob/main/results_2023-09-22T17-04-20.598203.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0028313758389261743,\n\
\ \"em_stderr\": 0.0005441551135493922,\n \"f1\": 0.0398909395973155,\n\
\ \"f1_stderr\": 0.0011867178799463702,\n \"acc\": 0.26710430338450897,\n\
\ \"acc_stderr\": 0.007769858100932032\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135493922,\n\
\ \"f1\": 0.0398909395973155,\n \"f1_stderr\": 0.0011867178799463702\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.001514573561224551\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5311760063141279,\n \"acc_stderr\": 0.014025142640639513\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T17_04_20.598203
path:
- '**/details_harness|drop|3_2023-09-22T17-04-20.598203.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T17-04-20.598203.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T17_04_20.598203
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-04-20.598203.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-04-20.598203.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T17_04_20.598203
path:
- '**/details_harness|winogrande|5_2023-09-22T17-04-20.598203.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T17-04-20.598203.parquet'
- config_name: results
data_files:
- split: 2023_09_22T17_04_20.598203
path:
- results_2023-09-22T17-04-20.598203.parquet
- split: latest
path:
- results_2023-09-22T17-04-20.598203.parquet
---
# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-560m-RLHF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/bloom-560m-RLHF](https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF",
"harness_winogrande_5",
split="train")
```
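Each run is also addressable by its timestamped split name (listed in the config section above), which is useful when you want to pin an exact evaluation run rather than whatever `latest` currently resolves to. A sketch:
```python
from datasets import load_dataset

# Pin the exact run via its timestamped split name from the config list above.
run = load_dataset(
    "open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF",
    "harness_gsm8k_5",
    split="2023_09_22T17_04_20.598203",
)
print(run.column_names)  # inspect the per-sample fields
```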
## Latest results
These are the [latest results from run 2023-09-22T17:04:20.598203](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF/blob/main/results_2023-09-22T17-04-20.598203.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135493922,
"f1": 0.0398909395973155,
"f1_stderr": 0.0011867178799463702,
"acc": 0.26710430338450897,
"acc_stderr": 0.007769858100932032
},
"harness|drop|3": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135493922,
"f1": 0.0398909395973155,
"f1_stderr": 0.0011867178799463702
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.001514573561224551
},
"harness|winogrande|5": {
"acc": 0.5311760063141279,
"acc_stderr": 0.014025142640639513
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hearmeneigh/e621-rising-v3-preliminary-data | 2023-10-09T18:42:40.000Z | [
"furry",
"anthro",
"nsfw",
"e621",
"not-for-all-audiences",
"region:us"
] | hearmeneigh | null | null | null | 0 | 0 | ---
dataset_info:
pretty_name: 'E621 Rising V3: Preliminary Data'
viewer: false
tags:
- furry
- anthro
- nsfw
- e621
- not-for-all-audiences
---
# E621 Rising V3: Preliminary Data
Snapshot metadata from E621.net as of 2023-09-21
|
Xenova/semantic-image-search-assets | 2023-09-22T20:37:30.000Z | [
"region:us"
] | Xenova | null | null | null | 0 | 0 | Entry not found |