id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 6.67k ⌀ | citation stringlengths 0 10.7k ⌀ | likes int64 0 3.66k | downloads int64 0 8.89M | created timestamp[us] | card stringlengths 11 977k | card_len int64 11 977k | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|
tinhpx2911/vanhoc_processed | 2023-10-11T10:12:47.000Z | [
"region:us"
] | tinhpx2911 | null | null | 0 | 0 | 2023-10-11T10:11:52 | ---
dataset_info:
features:
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 161543279
num_examples: 28242
download_size: 81656333
dataset_size: 161543279
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vanhoc_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 484 | [
[ …embedding values truncated in source… ] ] |
autoevaluate/autoeval-eval-banking77-default-c7e778-94421146088 | 2023-10-11T10:39:14.000Z | [
"autotrain",
"evaluation",
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-11T10:38:35 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- banking77
eval_info:
task: multi_class_classification
model: thainq107/bert-base-banking77-pt2
metrics: []
dataset_name: banking77
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: thainq107/bert-base-banking77-pt2
* Dataset: banking77
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@cnxt](https://huggingface.co/cnxt) for evaluating this model. | 850 | [
[ …embedding values truncated in source… ] ] |
rntc/pubmed_preprocess | 2023-10-11T13:15:21.000Z | [
"region:us"
] | rntc | null | null | 0 | 0 | 2023-10-11T10:40:27 | ---
configs:
- config_name: default
data_files:
- split: fr
path: data/fr-*
- split: en
path: data/en-*
- split: es
path: data/es-*
- split: de
path: data/de-*
- split: it
path: data/it-*
- split: nl
path: data/nl-*
- split: pl
path: data/pl-*
- split: pt
path: data/pt-*
- split: ro
path: data/ro-*
- split: ru
path: data/ru-*
- split: zh
path: data/zh-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: fr
num_bytes: 30582169
num_examples: 28715
- name: en
num_bytes: 90868163767
num_examples: 97816514
- name: es
num_bytes: 9925215
num_examples: 14671
- name: de
num_bytes: 46540591
num_examples: 53202
- name: it
num_bytes: 79767
num_examples: 125
- name: nl
num_bytes: 373829
num_examples: 461
- name: pl
num_bytes: 727984
num_examples: 877
- name: pt
num_bytes: 29942156
num_examples: 44558
- name: ro
num_bytes: 103813
num_examples: 187
- name: ru
num_bytes: 2320647
num_examples: 1671
- name: zh
num_bytes: 11481632
num_examples: 10612
download_size: 302082086
dataset_size: 91000241570
---
# Dataset Card for "pubmed_preprocess"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 1,380 | [
[ …embedding values truncated in source… ] ] |
shivamchawla/giftcartbot | 2023-10-11T10:41:09.000Z | [
"region:us"
] | shivamchawla | null | null | 0 | 0 | 2023-10-11T10:41:09 | Entry not found | 15 | [
[ …embedding values truncated in source… ] ] |
Kamyar-zeinalipour/EN_CW | 2023-10-11T12:55:52.000Z | [
"region:us"
] | Kamyar-zeinalipour | null | null | 0 | 0 | 2023-10-11T10:43:09 | ---
dataset_info:
features:
- name: date
dtype: string
- name: answer
dtype: string
- name: clue
dtype: string
- name: partial
dtype: bool
- name: couple_occurencies
dtype: int64
splits:
- name: train
num_bytes: 387434957
num_examples: 7327448
download_size: 188270614
dataset_size: 387434957
---
# Dataset Card for "EN_CW"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 504 | [
[ …embedding values truncated in source… ] ] |
daspartho/demo_dataset | 2023-10-11T11:17:10.000Z | [
"region:us"
] | daspartho | null | null | 0 | 0 | 2023-10-11T11:17:07 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 1634
num_examples: 25
download_size: 2287
dataset_size: 1634
---
# Dataset Card for "demo_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 463 | [
[ …embedding values truncated in source… ] ] |
deepghs/reg_experiment | 2023-10-31T11:39:57.000Z | [
"region:us"
] | deepghs | null | null | 1 | 0 | 2023-10-11T11:17:31 | Entry not found | 15 | [
[ …embedding values truncated in source… ] ] |
daspartho/spoiler_or_not | 2023-10-11T11:50:42.000Z | [
"region:us"
] | daspartho | null | null | 0 | 0 | 2023-10-11T11:45:53 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 1657
num_examples: 25
download_size: 2423
dataset_size: 1657
---
# Dataset Card for "spoiler_or_not"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 465 | [
[ …embedding values truncated in source… ] ] |
makram93/testing | 2023-10-11T11:56:29.000Z | [
"region:us"
] | makram93 | null | null | 0 | 0 | 2023-10-11T11:56:27 | ---
dataset_info:
features:
- name: url
dtype: string
- name: doc_id
dtype: string
- name: title
sequence: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 679123
num_examples: 824
download_size: 388552
dataset_size: 679123
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "testing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 538 | [
[ …embedding values truncated in source… ] ] |
bongo2112/alikiba-SDxl-Video-Outputs | 2023-10-11T14:03:06.000Z | [
"region:us"
] | bongo2112 | null | null | 0 | 0 | 2023-10-11T12:16:16 | Entry not found | 15 | [
[ …embedding values truncated in source… ] ] |
bongo2112/mixed-SDXL-Video-Outputs | 2023-10-11T14:16:12.000Z | [
"region:us"
] | bongo2112 | null | null | 0 | 0 | 2023-10-11T12:30:24 | Entry not found | 15 | [
[ …embedding values truncated in source… ] ] |
autoevaluate/autoeval-eval-acronym_identification-default-0665e6-94440146094 | 2023-10-11T12:44:38.000Z | [
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-11T12:44:34 | Entry not found | 15 | [
[ …embedding values truncated in source… ] ] |
BUDDI-AI/Speeding-up-LIME | 2023-10-11T14:47:26.000Z | [
"language:en",
"license:cc-by-nc-nd-4.0",
"region:us"
] | BUDDI-AI | null | null | 0 | 0 | 2023-10-11T12:51:00 | ---
license: cc-by-nc-nd-4.0
language:
- en
pretty_name: b
---
# Deidentified Medical Charts with Human Curated Explanations
*About*
This dataset is a small sample from the EHR dataset used in the experiments described in our paper, "Speeding up LIME with Attention Weights," submitted to CoDS-COMAD 2024.
| 304 | [
[ …embedding values truncated in source… ] ] |
open-llm-leaderboard/details_lgaalves__mistral-7b-platypus1k | 2023-10-11T13:00:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T12:59:12 | ---
pretty_name: Evaluation run of lgaalves/mistral-7b-platypus1k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/mistral-7b-platypus1k](https://huggingface.co/lgaalves/mistral-7b-platypus1k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__mistral-7b-platypus1k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T12:58:49.551109](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__mistral-7b-platypus1k/blob/main/results_2023-10-11T12-58-49.551109.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6304998086039316,\n\
\ \"acc_stderr\": 0.033058893340663746,\n \"acc_norm\": 0.6346959437303671,\n\
\ \"acc_norm_stderr\": 0.0330364656053113,\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920626,\n \"mc2\": 0.4695906948194394,\n\
\ \"mc2_stderr\": 0.01494642651529255\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.01421244498065189\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6269667396932882,\n\
\ \"acc_stderr\": 0.004826224784850442,\n \"acc_norm\": 0.8293168691495718,\n\
\ \"acc_norm_stderr\": 0.003754629313275163\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981762,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981762\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n\
\ \"acc_stderr\": 0.02489246917246284,\n \"acc_norm\": 0.7419354838709677,\n\
\ \"acc_norm_stderr\": 0.02489246917246284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473065,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473065\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391538,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391538\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814562,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814562\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703643,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703643\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407003,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3229050279329609,\n\
\ \"acc_stderr\": 0.015638440380241488,\n \"acc_norm\": 0.3229050279329609,\n\
\ \"acc_norm_stderr\": 0.015638440380241488\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n\
\ \"acc_stderr\": 0.012667701919603662,\n \"acc_norm\": 0.4367666232073012,\n\
\ \"acc_norm_stderr\": 0.012667701919603662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825362,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825362\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920626,\n \"mc2\": 0.4695906948194394,\n\
\ \"mc2_stderr\": 0.01494642651529255\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/mistral-7b-platypus1k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|arc:challenge|25_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hellaswag|10_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T12-58-49.551109.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T12-58-49.551109.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T12-58-49.551109.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T12-58-49.551109.parquet'
- config_name: results
data_files:
- split: 2023_10_11T12_58_49.551109
path:
- results_2023-10-11T12-58-49.551109.parquet
- split: latest
path:
- results_2023-10-11T12-58-49.551109.parquet
---
# Dataset Card for Evaluation run of lgaalves/mistral-7b-platypus1k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/mistral-7b-platypus1k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/mistral-7b-platypus1k](https://huggingface.co/lgaalves/mistral-7b-platypus1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__mistral-7b-platypus1k",
	"harness_truthfulqa_mc_0",
	split="latest")
```
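The repository id used above follows the leaderboard's naming convention: the `/` in the model id is replaced with `__` and the result is prefixed with `open-llm-leaderboard/details_`. A minimal sketch of that mapping (the helper name is illustrative, not part of any library):

```python
def details_repo_id(model_id: str) -> str:
    """Build the details-dataset repo id for a given Hub model id,
    following the naming convention observed in this card."""
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

# For the model evaluated here:
print(details_repo_id("lgaalves/mistral-7b-platypus1k"))
# open-llm-leaderboard/details_lgaalves__mistral-7b-platypus1k
```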
## Latest results
These are the [latest results from run 2023-10-11T12:58:49.551109](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__mistral-7b-platypus1k/blob/main/results_2023-10-11T12-58-49.551109.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.6304998086039316,
"acc_stderr": 0.033058893340663746,
"acc_norm": 0.6346959437303671,
"acc_norm_stderr": 0.0330364656053113,
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920626,
"mc2": 0.4695906948194394,
"mc2_stderr": 0.01494642651529255
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.6160409556313993,
"acc_norm_stderr": 0.01421244498065189
},
"harness|hellaswag|10": {
"acc": 0.6269667396932882,
"acc_stderr": 0.004826224784850442,
"acc_norm": 0.8293168691495718,
"acc_norm_stderr": 0.003754629313275163
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.029514703583981762,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.029514703583981762
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246284,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473065,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473065
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391538,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391538
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814562,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814562
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703643,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703643
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3229050279329609,
"acc_stderr": 0.015638440380241488,
"acc_norm": 0.3229050279329609,
"acc_norm_stderr": 0.015638440380241488
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4367666232073012,
"acc_stderr": 0.012667701919603662,
"acc_norm": 0.4367666232073012,
"acc_norm_stderr": 0.012667701919603662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825362,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825362
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920626,
"mc2": 0.4695906948194394,
"mc2_stderr": 0.01494642651529255
}
}
```
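Parsed as JSON, a results blob like the one above is a plain mapping from subtask name to metrics, so it can be summarized directly. A minimal sketch, reusing three of the reported accuracies:

```python
# A few subtask accuracies copied from the results blob above
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8846153846153846},
    "harness|hendrycksTest-virology|5": {"acc": 0.5542168674698795},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.91},
}

# Rank the subtasks by accuracy, best first
ranked = sorted(results, key=lambda task: results[task]["acc"], reverse=True)
print(ranked[0])  # harness|hendrycksTest-us_foreign_policy|5
```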
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,972 | [
[
-0.05010986328125,
-0.058319091796875,
0.019256591796875,
0.01505279541015625,
-0.01300048828125,
-0.004993438720703125,
0.0014905929565429688,
-0.0133209228515625,
0.039337158203125,
-0.0020694732666015625,
-0.034454345703125,
-0.04833984375,
-0.031112670898437... |
wlaminack/Nonlinearltestingdataset | 2023-10-11T13:34:00.000Z | [
"license:apache-2.0",
"region:us"
] | wlaminack | null | null | 0 | 0 | 2023-10-11T13:16:38 | ---
license: apache-2.0
---
import numpy as np

def basic(array1):
    # Center the first four coordinates around zero
    x = array1[0] - .5
    y = array1[1] - .5
    z = array1[2] - .5
    t = array1[3] - .5
    r2 = x*x + y*y + z*z + t*t
    # Nonlinear target: sine of the squared radius, plus noise scaled by the fifth column
    return 3*np.sin(r2) + np.random.random()*array1[4]

a = np.random.rand(1000, 5)  # example input; `a` is not defined in the original card
f = np.apply_along_axis(basic, 1, a)
[
-0.0016880035400390625,
-0.034088134765625,
0.053863525390625,
0.048583984375,
-0.004486083984375,
-0.01517486572265625,
0.0243682861328125,
-0.012969970703125,
0.03826904296875,
0.0396728515625,
-0.05474853515625,
-0.03533935546875,
-0.026458740234375,
-0.0... |
nehasingh555/genai-training-dataset | 2023-10-11T13:48:28.000Z | [
"region:us"
] | nehasingh555 | null | null | 0 | 0 | 2023-10-11T13:42:20 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
TrainingDataPro/aggressive-behavior-video-classification | 2023-10-11T13:52:29.000Z | [
"task_categories:video-classification",
"language:en",
"license:cc-by-nc-nd-4.0",
"code",
"legal",
"region:us"
] | TrainingDataPro | null | null | 0 | 0 | 2023-10-11T13:50:17 | ---
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
language:
- en
tags:
- code
- legal
---
# Aggressive Behavior Video Classification
## WARNING: People in the videos exhibit aggressive behavior
The dataset contains videos of people exhibiting **aggressive and non-aggressive behavior** and is intended for classification purposes. It consists of a collection of video files that capture various individuals engaging in different activities and displaying distinct behavioral patterns, together with a CSV file containing the classification labels.
**Aggressive Behavior Video Classification Dataset** can have multiple applications, such as surveillance systems, security modules, or social behavior analysis platforms.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=aggressive-behavior-video-classification) to discuss your requirements, learn about the price and buy the dataset.
# Dataset structure
The dataset consists of:
- **files**: folder with videos with people exhibiting aggressive and non-aggressive behaviour (subfolders "aggressive" and "non_aggressive" respectively),
- **.csv file**: the path of each video in the **"files"** folder and the classification of the behavior
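As a sketch of how this layout could be consumed downstream (the CSV column names `path` and `label` below are assumptions, not confirmed by the card):

```python
import csv
import io

# Hypothetical CSV content mirroring the described layout; the column
# names "path" and "label" are assumptions, not taken from the card.
sample_csv = """path,label
files/aggressive/clip_001.mp4,aggressive
files/non_aggressive/clip_002.mp4,non_aggressive
"""

# Group video paths by behavior class for a downstream classifier
by_class = {}
for row in csv.DictReader(io.StringIO(sample_csv)):
    by_class.setdefault(row["label"], []).append(row["path"])

print(by_class["aggressive"])  # ['files/aggressive/clip_001.mp4']
```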
# People Behavior Video Classification can be tailored to your requirements.
## **[TrainingData](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=aggressive-behavior-video-classification)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** | 2,031 | [
[
-0.010162353515625,
-0.073974609375,
-0.02264404296875,
0.0211639404296875,
-0.006389617919921875,
0.01611328125,
-0.01140594482421875,
-0.01110076904296875,
0.0214691162109375,
0.0278472900390625,
-0.053680419921875,
-0.05108642578125,
-0.06378173828125,
-0... |
vietlegalqa/vi_wh_questions | 2023-10-11T14:25:04.000Z | [
"region:us"
] | vietlegalqa | null | null | 0 | 0 | 2023-10-11T14:24:57 | ---
dataset_info:
features:
- name: Wh_question
dtype: string
- name: Meaning
dtype: string
- name: Place
dtype: string
- name: Coarse
dtype: string
- name: Fine_grained
dtype: string
splits:
- name: train
num_bytes: 1585
num_examples: 13
download_size: 3390
dataset_size: 1585
---
# Dataset Card for "vi_wh_questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 499 | [
[
-0.052154541015625,
-0.043487548828125,
0.018707275390625,
0.0018968582153320312,
-0.01129150390625,
-0.012664794921875,
0.018524169921875,
-0.0110626220703125,
0.06121826171875,
0.038360595703125,
-0.06396484375,
-0.04376220703125,
-0.0248565673828125,
-0.0... |
stany9g/4front.ai | 2023-10-11T14:33:12.000Z | [
"region:us"
] | stany9g | null | null | 0 | 0 | 2023-10-11T14:32:23 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.0350341796875,
0.046478271484375,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.0170135498046875,
-0.052093505859375,
-0.01497650146484375,
-0.0604248046875,
0.0379028... |
TiborUdvari/processing-project | 2023-11-01T14:41:10.000Z | [
"region:us"
] | TiborUdvari | null | null | 0 | 0 | 2023-10-11T14:35:39 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.0350341796875,
0.046478271484375,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.0170135498046875,
-0.052093505859375,
-0.01497650146484375,
-0.0604248046875,
0.0379028... |
nalmeida/agile_dataset | 2023-10-11T14:44:52.000Z | [
"region:us"
] | nalmeida | null | null | 0 | 0 | 2023-10-11T14:44:50 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2950354
num_examples: 25990
download_size: 613065
dataset_size: 2950354
---
# Dataset Card for "agile_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 510 | [
[
-0.03265380859375,
-0.0164947509765625,
0.0087432861328125,
0.032196044921875,
-0.0079803466796875,
0.01468658447265625,
0.01593017578125,
-0.0231475830078125,
0.052276611328125,
0.01715087890625,
-0.0599365234375,
-0.05218505859375,
-0.031494140625,
-0.0204... |
open-llm-leaderboard/details_AA051610__T1B | 2023-10-11T14:49:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T14:48:15 | ---
pretty_name: Evaluation run of AA051610/T1B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/T1B](https://huggingface.co/AA051610/T1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__T1B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T14:47:52.551958](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__T1B/blob/main/results_2023-10-11T14-47-52.551958.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5990888068369461,\n\
\ \"acc_stderr\": 0.03431005125414193,\n \"acc_norm\": 0.6027895245963486,\n\
\ \"acc_norm_stderr\": 0.03429409520845181,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4701781470729953,\n\
\ \"mc2_stderr\": 0.014777434418052576\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294321,\n\
\ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.611929894443338,\n\
\ \"acc_stderr\": 0.004863147544177516,\n \"acc_norm\": 0.7978490340569607,\n\
\ \"acc_norm_stderr\": 0.004007834585541846\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549652,\n\
\ \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549652\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099522,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099522\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7032258064516129,\n \"acc_stderr\": 0.025988500792411905,\n \"\
acc_norm\": 0.7032258064516129,\n \"acc_norm_stderr\": 0.025988500792411905\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.039036986477484395,\n\
\ \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.039036986477484395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.040348466786033974,\n \"\
acc_norm\": 0.423841059602649,\n \"acc_norm_stderr\": 0.040348466786033974\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.01765871059444313,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.01765871059444313\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6176470588235294,\n \"acc_stderr\": 0.03410785338904719,\n \"\
acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03410785338904719\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.0403931497872456,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.0403931497872456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648372,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494583,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.026033890613576277,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.026033890613576277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n\
\ \"acc_stderr\": 0.016251139711570765,\n \"acc_norm\": 0.38212290502793295,\n\
\ \"acc_norm_stderr\": 0.016251139711570765\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155885996,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155885996\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073056,\n \
\ \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073056\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.02971932942241748,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.02971932942241748\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113503,\n \"mc2\": 0.4701781470729953,\n\
\ \"mc2_stderr\": 0.014777434418052576\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/T1B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|arc:challenge|25_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hellaswag|10_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T14-47-52.551958.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T14-47-52.551958.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T14-47-52.551958.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T14-47-52.551958.parquet'
- config_name: results
data_files:
- split: 2023_10_11T14_47_52.551958
path:
- results_2023-10-11T14-47-52.551958.parquet
- split: latest
path:
- results_2023-10-11T14-47-52.551958.parquet
---
# Dataset Card for Evaluation run of AA051610/T1B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AA051610/T1B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AA051610/T1B](https://huggingface.co/AA051610/T1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__T1B",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-11T14:47:52.551958](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__T1B/blob/main/results_2023-10-11T14-47-52.551958.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5990888068369461,
"acc_stderr": 0.03431005125414193,
"acc_norm": 0.6027895245963486,
"acc_norm_stderr": 0.03429409520845181,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4701781470729953,
"mc2_stderr": 0.014777434418052576
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294321,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212864
},
"harness|hellaswag|10": {
"acc": 0.611929894443338,
"acc_stderr": 0.004863147544177516,
"acc_norm": 0.7978490340569607,
"acc_norm_stderr": 0.004007834585541846
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549652,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549652
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099522,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099522
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411905,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411905
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.039036986477484395,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.039036986477484395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.040348466786033974,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.040348466786033974
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.01765871059444313,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.01765871059444313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03410785338904719,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03410785338904719
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.0403931497872456,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.0403931497872456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648372,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494583,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.026033890613576277,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.026033890613576277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570765,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570765
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885996,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885996
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.019766211991073056,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.019766211991073056
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.02971932942241748,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.02971932942241748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113503,
"mc2": 0.4701781470729953,
"mc2_stderr": 0.014777434418052576
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,717 | [
[
-0.0504150390625,
-0.060455322265625,
0.0182037353515625,
0.0148468017578125,
-0.010467529296875,
-0.00466156005859375,
0.001224517822265625,
-0.0162811279296875,
0.040740966796875,
-0.004840850830078125,
-0.033966064453125,
-0.04815673828125,
-0.030517578125,
... |
ramchiluveru/MarketingMail | 2023-10-11T15:03:11.000Z | [
"region:us"
] | ramchiluveru | null | null | 0 | 0 | 2023-10-11T15:03:10 | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 19321
num_examples: 10
download_size: 25230
dataset_size: 19321
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "MarketingMail"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 520 | [
[
-0.025054931640625,
-0.01003265380859375,
-0.00719451904296875,
0.0199127197265625,
-0.0004940032958984375,
-0.00461578369140625,
0.0127105712890625,
-0.0002715587615966797,
0.0595703125,
0.0406494140625,
-0.08038330078125,
-0.06097412109375,
-0.0294342041015625... |
renumics/spotlight-osunlp-MagicBrush-enrichment | 2023-10-11T15:04:54.000Z | [
"region:us"
] | renumics | null | null | 0 | 0 | 2023-10-11T15:04:49 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: img_id.embedding
sequence: float32
length: 2
- name: source_img.embedding
sequence: float32
length: 2
- name: mask_img.embedding
sequence: float32
length: 2
- name: instruction.embedding
sequence: float32
length: 2
- name: target_img.embedding
sequence: float32
length: 2
splits:
- name: train
num_bytes: 352280
num_examples: 8807
- name: dev
num_bytes: 21120
num_examples: 528
download_size: 524053
dataset_size: 373400
---
# Dataset Card for "spotlight-osunlp-MagicBrush-enrichment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 853 | [
[
-0.0426025390625,
-0.0191650390625,
0.0027790069580078125,
0.015411376953125,
-0.001708984375,
0.008636474609375,
0.00514984130859375,
-0.0198211669921875,
0.06011962890625,
0.041595458984375,
-0.0760498046875,
-0.034759521484375,
-0.035430908203125,
-0.0152... |
JuanJaramillo/samsum-sp | 2023-10-11T15:09:54.000Z | [
"region:us"
] | JuanJaramillo | null | null | 0 | 0 | 2023-10-11T15:08:55 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
open-llm-leaderboard/details_AA051610__T2A | 2023-10-11T15:18:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T15:16:46 | ---
pretty_name: Evaluation run of AA051610/T2A
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/T2A](https://huggingface.co/AA051610/T2A) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__T2A\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T15:16:23.487044](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__T2A/blob/main/results_2023-10-11T15-16-23.487044.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6173946376864352,\n\
\ \"acc_stderr\": 0.03337871209335492,\n \"acc_norm\": 0.6210628259045844,\n\
\ \"acc_norm_stderr\": 0.03336880945880684,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.47014420938426915,\n\
\ \"mc2_stderr\": 0.014571966148559557\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4854948805460751,\n \"acc_stderr\": 0.014605241081370053,\n\
\ \"acc_norm\": 0.514505119453925,\n \"acc_norm_stderr\": 0.014605241081370056\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5524795857398924,\n\
\ \"acc_stderr\": 0.0049622205125483525,\n \"acc_norm\": 0.739892451702848,\n\
\ \"acc_norm_stderr\": 0.004377965074211627\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118634,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118634\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382175,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"\
acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215639,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215639\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \"\
acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572203,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.042450224863844956,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.042450224863844956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.01414397027665757,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.01414397027665757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688218,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n\
\ \"acc_stderr\": 0.015445716910998874,\n \"acc_norm\": 0.30837988826815643,\n\
\ \"acc_norm_stderr\": 0.015445716910998874\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826514,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826514\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n\
\ \"acc_stderr\": 0.012768922739553313,\n \"acc_norm\": 0.49282920469361147,\n\
\ \"acc_norm_stderr\": 0.012768922739553313\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6209150326797386,\n \"acc_stderr\": 0.019627444748412243,\n \
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.019627444748412243\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.0162380650690596,\n \"mc2\": 0.47014420938426915,\n\
\ \"mc2_stderr\": 0.014571966148559557\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/T2A
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|arc:challenge|25_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hellaswag|10_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-16-23.487044.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-16-23.487044.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T15-16-23.487044.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T15-16-23.487044.parquet'
- config_name: results
data_files:
- split: 2023_10_11T15_16_23.487044
path:
- results_2023-10-11T15-16-23.487044.parquet
- split: latest
path:
- results_2023-10-11T15-16-23.487044.parquet
---
# Dataset Card for Evaluation run of AA051610/T2A
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AA051610/T2A
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AA051610/T2A](https://huggingface.co/AA051610/T2A) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__T2A",
"harness_truthfulqa_mc_0",
               split="latest")
```
## Latest results
These are the [latest results from run 2023-10-11T15:16:23.487044](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__T2A/blob/main/results_2023-10-11T15-16-23.487044.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```json
{
"all": {
"acc": 0.6173946376864352,
"acc_stderr": 0.03337871209335492,
"acc_norm": 0.6210628259045844,
"acc_norm_stderr": 0.03336880945880684,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.47014420938426915,
"mc2_stderr": 0.014571966148559557
},
"harness|arc:challenge|25": {
"acc": 0.4854948805460751,
"acc_stderr": 0.014605241081370053,
"acc_norm": 0.514505119453925,
"acc_norm_stderr": 0.014605241081370056
},
"harness|hellaswag|10": {
"acc": 0.5524795857398924,
"acc_stderr": 0.0049622205125483525,
"acc_norm": 0.739892451702848,
"acc_norm_stderr": 0.004377965074211627
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118634,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118634
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382175,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215639,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215639
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572203,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.042450224863844956,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.042450224863844956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.01414397027665757,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.01414397027665757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688218,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.015445716910998874,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.015445716910998874
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826514,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826514
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236837,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236837
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553313,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553313
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.019627444748412243,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.019627444748412243
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.0162380650690596,
"mc2": 0.47014420938426915,
"mc2_stderr": 0.014571966148559557
}
}
```
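The per-task `acc` fields above can be aggregated directly once the results JSON is loaded. As a minimal sketch (the dict below is a hand-copied subset of the scores shown above, and `mean_mmlu_acc` is an illustrative helper, not part of the evaluation harness):

```python
from statistics import mean

# Hand-copied subset of the results shown above; a real run would
# json.load() the full results_*.json file instead.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.4879518072289157},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7894736842105263},
    "harness|truthfulqa:mc|0": {"mc1": 0.31334149326805383},  # no "acc" key
}

def mean_mmlu_acc(res):
    """Average the five-shot MMLU ("hendrycksTest") accuracies."""
    return mean(v["acc"] for k, v in res.items()
                if k.startswith("harness|hendrycksTest-"))

print(round(mean_mmlu_acc(results), 4))  # -> 0.6387 for this subset
```

The same filter-by-prefix pattern works for any of the `harness|…` task families in the file.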
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,789 |
open-llm-leaderboard/details_AA051610__T1C | 2023-10-11T15:22:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T15:21:56 | ---
pretty_name: Evaluation run of AA051610/T1C
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/T1C](https://huggingface.co/AA051610/T1C) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__T1C\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T15:21:34.954726](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__T1C/blob/main/results_2023-10-11T15-21-34.954726.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5614045523007456,\n\
\ \"acc_stderr\": 0.034472805150990236,\n \"acc_norm\": 0.5650409022375938,\n\
\ \"acc_norm_stderr\": 0.03446466967324352,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42517178573631115,\n\
\ \"mc2_stderr\": 0.01461529390566251\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4709897610921502,\n \"acc_stderr\": 0.014586776355294316,\n\
\ \"acc_norm\": 0.5017064846416383,\n \"acc_norm_stderr\": 0.01461130570505699\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5382393945429197,\n\
\ \"acc_stderr\": 0.004975167382061832,\n \"acc_norm\": 0.7220673172674766,\n\
\ \"acc_norm_stderr\": 0.004470644845242893\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.043192236258113324,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.043192236258113324\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981748,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981748\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398177,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398177\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n\
\ \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n\
\ \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.502127659574468,\n \"acc_stderr\": 0.032685726586674915,\n \"\
acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572284,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533084,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507382,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507382\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7541284403669725,\n \"acc_stderr\": 0.018461940968708443,\n \"\
acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.018461940968708443\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6029411764705882,\n \"acc_stderr\": 0.03434131164719129,\n \"\
acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.03434131164719129\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.02629622791561367,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.02629622791561367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.01536686038639711,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.01536686038639711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.02935491115994098,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.02935491115994098\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.0200176292142131,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.0200176292142131\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087548,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087548\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.42517178573631115,\n\
\ \"mc2_stderr\": 0.01461529390566251\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/T1C
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|arc:challenge|25_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hellaswag|10_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-21-34.954726.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-21-34.954726.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T15-21-34.954726.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T15-21-34.954726.parquet'
- config_name: results
data_files:
- split: 2023_10_11T15_21_34.954726
path:
- results_2023-10-11T15-21-34.954726.parquet
- split: latest
path:
- results_2023-10-11T15-21-34.954726.parquet
---
# Dataset Card for Evaluation run of AA051610/T1C
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AA051610/T1C
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AA051610/T1C](https://huggingface.co/AA051610/T1C) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__T1C",
"harness_truthfulqa_mc_0",
	split="latest")
```
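The split names and parquet file names in the configs above are mechanical rewrites of the run timestamp: split names replace both `-` and `:` with `_` (giving `2023_10_11T15_21_34.954726`), while file names only replace `:` with `-` (giving `2023-10-11T15-21-34.954726`). A small sketch of that mapping — the helper names here are illustrative, not part of the leaderboard tooling:

```python
# Illustrative helpers: derive the split name and the parquet-filename
# timestamp from an ISO-style run timestamp, matching the patterns in the
# YAML configs above. The function names are made up for this sketch.

def split_name(run_timestamp: str) -> str:
    # "2023-10-11T15:21:34.954726" -> "2023_10_11T15_21_34.954726"
    return run_timestamp.replace("-", "_").replace(":", "_")

def file_timestamp(run_timestamp: str) -> str:
    # "2023-10-11T15:21:34.954726" -> "2023-10-11T15-21-34.954726"
    return run_timestamp.replace(":", "-")

run = "2023-10-11T15:21:34.954726"
print(split_name(run))      # 2023_10_11T15_21_34.954726
print(file_timestamp(run))  # 2023-10-11T15-21-34.954726
```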
## Latest results
These are the [latest results from run 2023-10-11T15:21:34.954726](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__T1C/blob/main/results_2023-10-11T15-21-34.954726.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5614045523007456,
"acc_stderr": 0.034472805150990236,
"acc_norm": 0.5650409022375938,
"acc_norm_stderr": 0.03446466967324352,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42517178573631115,
"mc2_stderr": 0.01461529390566251
},
"harness|arc:challenge|25": {
"acc": 0.4709897610921502,
"acc_stderr": 0.014586776355294316,
"acc_norm": 0.5017064846416383,
"acc_norm_stderr": 0.01461130570505699
},
"harness|hellaswag|10": {
"acc": 0.5382393945429197,
"acc_stderr": 0.004975167382061832,
"acc_norm": 0.7220673172674766,
"acc_norm_stderr": 0.004470644845242893
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.043192236258113324,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.043192236258113324
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981748,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981748
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.02951470358398177,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.02951470358398177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572284,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533084,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507382,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507382
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7541284403669725,
"acc_stderr": 0.018461940968708443,
"acc_norm": 0.7541284403669725,
"acc_norm_stderr": 0.018461940968708443
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.02629622791561367,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.02629622791561367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.01536686038639711,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.01536686038639711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.02935491115994098,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.02935491115994098
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087548,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087548
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.42517178573631115,
"mc2_stderr": 0.01461529390566251
}
}
```
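Since every per-task entry shares the same key layout, a few lines of plain Python are enough to rank tasks by accuracy. This sketch uses a handful of values copied verbatim from the JSON above rather than reloading the dataset:

```python
# Rank a few of the per-task accuracies from the results above.
# The dict holds values copied verbatim from the JSON; extend it with the
# remaining tasks in the same way if you load the full results file.
task_acc = {
    "hendrycksTest-marketing": 0.8290598290598291,
    "hendrycksTest-high_school_government_and_politics": 0.7979274611398963,
    "hendrycksTest-world_religions": 0.783625730994152,
    "hendrycksTest-moral_scenarios": 0.3027932960893855,
    "hendrycksTest-high_school_european_history": 0.2606060606060606,
}

best = max(task_acc, key=task_acc.get)
worst = min(task_acc, key=task_acc.get)
print(best, task_acc[best])    # hendrycksTest-marketing 0.8290598290598291
print(worst, task_acc[worst])  # hendrycksTest-high_school_european_history 0.2606060606060606
```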
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_ehartford__samantha-1.2-mistral-7b | 2023-10-24T06:58:30.000Z | ["region:us"] | open-llm-leaderboard | null | null | 1 | 0 | 2023-10-11T15:46:51 |
---
pretty_name: Evaluation run of ehartford/samantha-1.2-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/samantha-1.2-mistral-7b](https://huggingface.co/ehartford/samantha-1.2-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"latest\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__samantha-1.2-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T06:58:18.439243](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-1.2-mistral-7b/blob/main/results_2023-10-24T06-58-18.439243.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each one in the \"results\" config and the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002726510067114094,\n\
\ \"em_stderr\": 0.0005340111700415926,\n \"f1\": 0.06134647651006727,\n\
\ \"f1_stderr\": 0.001402920930367906,\n \"acc\": 0.47757263909840575,\n\
\ \"acc_stderr\": 0.010941242547603296\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415926,\n\
\ \"f1\": 0.06134647651006727,\n \"f1_stderr\": 0.001402920930367906\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16982562547384383,\n \
\ \"acc_stderr\": 0.010342572360861202\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345393\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/samantha-1.2-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|arc:challenge|25_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T06_58_18.439243
path:
- '**/details_harness|drop|3_2023-10-24T06-58-18.439243.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T06-58-18.439243.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T06_58_18.439243
path:
- '**/details_harness|gsm8k|5_2023-10-24T06-58-18.439243.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T06-58-18.439243.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hellaswag|10_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-46-28.898359.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T15-46-28.898359.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T15-46-28.898359.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T06_58_18.439243
path:
- '**/details_harness|winogrande|5_2023-10-24T06-58-18.439243.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T06-58-18.439243.parquet'
- config_name: results
data_files:
- split: 2023_10_11T15_46_28.898359
path:
- results_2023-10-11T15-46-28.898359.parquet
- split: 2023_10_24T06_58_18.439243
path:
- results_2023-10-24T06-58-18.439243.parquet
- split: latest
path:
- results_2023-10-24T06-58-18.439243.parquet
---
# Dataset Card for Evaluation run of ehartford/samantha-1.2-mistral-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/samantha-1.2-mistral-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/samantha-1.2-mistral-7b](https://huggingface.co/ehartford/samantha-1.2-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
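The timestamped split names follow a simple, mechanical transformation of the run timestamp — a pattern inferred from the config listing above, not an official API guarantee: dashes and colons become underscores, while the fractional-seconds dot is kept. A minimal sketch (the helper name is ours):

```python
def split_name_from_timestamp(run_timestamp: str) -> str:
    """Map a run timestamp such as '2023-10-24T06:58:18.439243' to the
    split name used in this dataset ('2023_10_24T06_58_18.439243').

    Pattern inferred from the config listing above: '-' and ':' become
    '_'; the dot before the microseconds is preserved.
    """
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2023-10-24T06:58:18.439243"))
# 2023_10_24T06_58_18.439243
```

This is convenient when you know a run's timestamp from a results file name and want the matching split.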
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__samantha-1.2-mistral-7b",
"harness_winogrande_5",
	split="latest")
```
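The repo id used above appears to follow a fixed naming pattern — `open-llm-leaderboard/details_{org}__{model}`, with the `/` in the model id replaced by `__`. The helper below is a hypothetical convenience (not part of any official library) that builds the id under that assumption:

```python
def details_repo_id(model_id: str) -> str:
    """Build the details-dataset repo id for a model evaluated on the
    Open LLM Leaderboard.

    Naming pattern inferred from this card: 'org/model' becomes
    'open-llm-leaderboard/details_org__model'.
    """
    org, name = model_id.split("/")
    return f"open-llm-leaderboard/details_{org}__{name}"

print(details_repo_id("ehartford/samantha-1.2-mistral-7b"))
# open-llm-leaderboard/details_ehartford__samantha-1.2-mistral-7b
```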
## Latest results
These are the [latest results from run 2023-10-24T06:58:18.439243](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-1.2-mistral-7b/blob/main/results_2023-10-24T06-58-18.439243.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415926,
"f1": 0.06134647651006727,
"f1_stderr": 0.001402920930367906,
"acc": 0.47757263909840575,
"acc_stderr": 0.010941242547603296
},
"harness|drop|3": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415926,
"f1": 0.06134647651006727,
"f1_stderr": 0.001402920930367906
},
"harness|gsm8k|5": {
"acc": 0.16982562547384383,
"acc_stderr": 0.010342572360861202
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345393
}
}
```
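Each accuracy above is reported together with its standard error. A quick sketch (normal approximation; not part of the evaluation harness output) turns those pairs into approximate 95% confidence intervals:

```python
# Aggregated (acc, acc_stderr) pairs copied from the results JSON above.
results = {
    "harness|gsm8k|5": (0.16982562547384383, 0.010342572360861202),
    "harness|winogrande|5": (0.7853196527229677, 0.011539912734345393),
}

def ci95(acc: float, stderr: float) -> tuple[float, float]:
    """Approximate 95% confidence interval using the normal approximation."""
    half_width = 1.96 * stderr
    return acc - half_width, acc + half_width

for task, (acc, stderr) in results.items():
    low, high = ci95(acc, stderr)
    print(f"{task}: acc={acc:.4f}, 95% CI [{low:.4f}, {high:.4f}]")
```

This is useful when comparing two runs: if the intervals of a metric overlap heavily, the difference is unlikely to be meaningful.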
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
Mai0313/Chess-AI-Database | 2023-10-11T17:53:59.000Z | ["region:us"] | Mai0313 | null | null | 0 | 0 | 2023-10-11T16:06:24 |
# chess-AI-database
This is the main database for Chess AI.
For more details, please see [Chess-AI-Pytorch](https://github.com/Mai0313/Chess-AI-Pytorch).
open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7 | 2023-10-11T16:10:56.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T16:09:54 |
---
pretty_name: Evaluation run of Undi95/Mistral-11B-TestBench7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Mistral-11B-TestBench7](https://huggingface.co/Undi95/Mistral-11B-TestBench7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 61 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T16:09:31.642289](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7/blob/main/results_2023-10-11T16-09-31.642289.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6399052867360159,\n\
\ \"acc_stderr\": 0.03310704632621164,\n \"acc_norm\": 0.6439213227226402,\n\
\ \"acc_norm_stderr\": 0.03308447285363473,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.4691495265456508,\n\
\ \"mc2_stderr\": 0.014857248788144817\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472432,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.63433578968333,\n \
\ \"acc_stderr\": 0.004806316342709402,\n \"acc_norm\": 0.8286197968532165,\n\
\ \"acc_norm_stderr\": 0.0037607069750393053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.023559646983189946,\n\
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.023559646983189946\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.01612927102509986,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.01612927102509986\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407006,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407006\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.01624202883405362,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.01624202883405362\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n\
\ \"acc_stderr\": 0.012667701919603662,\n \"acc_norm\": 0.4367666232073012,\n\
\ \"acc_norm_stderr\": 0.012667701919603662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623557,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623557\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.4691495265456508,\n\
\ \"mc2_stderr\": 0.014857248788144817\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Mistral-11B-TestBench7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-09-31.642289.parquet'
- config_name: results
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- results_2023-10-11T16-09-31.642289.parquet
- split: latest
path:
- results_2023-10-11T16-09-31.642289.parquet
---
# Dataset Card for Evaluation run of Undi95/Mistral-11B-TestBench7
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Mistral-11B-TestBench7
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-TestBench7](https://huggingface.co/Undi95/Mistral-11B-TestBench7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7",
"harness_truthfulqa_mc_0",
split="latest")
```
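The configuration names in the metadata above are derived mechanically from the harness task names: `:` and `-` become `_`, and the result is wrapped as `harness_<task>_<num_fewshot>`. A minimal sketch of that mapping (the `config_name` helper is hypothetical, shown for illustration only — it is not part of the `datasets` API):

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Build the Hub configuration name for a harness task.

    `task` is the name as it appears in the parquet paths, e.g.
    'hendrycksTest-college_physics' or 'truthfulqa:mc'.
    """
    sanitized = task.replace(":", "_").replace("-", "_")
    return f"harness_{sanitized}_{num_fewshot}"


print(config_name("hendrycksTest-college_physics", 5))
# harness_hendrycksTest_college_physics_5
print(config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

Both outputs match configurations listed in this card's metadata, so the same pattern can be used to address any of the 61 task configurations when calling `load_dataset`.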
## Latest results
These are the [latest results from run 2023-10-11T16:09:31.642289](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7/blob/main/results_2023-10-11T16-09-31.642289.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6399052867360159,
"acc_stderr": 0.03310704632621164,
"acc_norm": 0.6439213227226402,
"acc_norm_stderr": 0.03308447285363473,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.4691495265456508,
"mc2_stderr": 0.014857248788144817
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472432,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.63433578968333,
"acc_stderr": 0.004806316342709402,
"acc_norm": 0.8286197968532165,
"acc_norm_stderr": 0.0037607069750393053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.023559646983189946,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.023559646983189946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.01612927102509986,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.01612927102509986
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407006,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407006
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.01624202883405362,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.01624202883405362
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818777,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818777
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4367666232073012,
"acc_stderr": 0.012667701919603662,
"acc_norm": 0.4367666232073012,
"acc_norm_stderr": 0.012667701919603662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623557,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623557
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.4691495265456508,
"mc2_stderr": 0.014857248788144817
}
}
```
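For quick sanity checks it can help to post-process a results dictionary like the one above without loading the full dataset. The snippet below is only a sketch: the four-way average is an assumption made for illustration (the leaderboard's exact aggregation formula is not documented in this card), and `results` is a pared-down copy of the headline numbers above:

```python
# Pared-down copy of the headline metrics from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6331058020477816},
    "harness|hellaswag|10": {"acc_norm": 0.8286197968532165},
    "all": {"acc": 0.6399052867360159},  # mean accuracy from the "all" block
    "harness|truthfulqa:mc|0": {"mc2": 0.4691495265456508},
}

# Hypothetical four-way average over the headline metrics:
headline = [
    results["harness|arc:challenge|25"]["acc_norm"],  # ARC (25-shot)
    results["harness|hellaswag|10"]["acc_norm"],      # HellaSwag (10-shot)
    results["all"]["acc"],                            # aggregate accuracy
    results["harness|truthfulqa:mc|0"]["mc2"],        # TruthfulQA (0-shot)
]
average = sum(headline) / len(headline)
print(f"{average:.4f}")  # 0.6427
```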
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,971 |
rishiraj/guanaco-style-metamath-40k | 2023-10-11T16:20:57.000Z | [
"region:us"
] | rishiraj | null | null | 1 | 0 | 2023-10-11T16:16:51 | Entry not found | 15 |
open-llm-leaderboard/details_adept__persimmon-8b-base | 2023-10-11T16:31:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T16:30:19 | ---
pretty_name: Evaluation run of adept/persimmon-8b-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adept/persimmon-8b-base](https://huggingface.co/adept/persimmon-8b-base) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adept__persimmon-8b-base\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T16:30:00.730198](https://huggingface.co/datasets/open-llm-leaderboard/details_adept__persimmon-8b-base/blob/main/results_2023-10-11T16-30-00.730198.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4373382174928584,\n\
\ \"acc_stderr\": 0.03537473296886481,\n \"acc_norm\": 0.440779620602171,\n\
\ \"acc_norm_stderr\": 0.03536781150443019,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.378505315070287,\n\
\ \"mc2_stderr\": 0.013586954257578736\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41552901023890787,\n \"acc_stderr\": 0.014401366641216384,\n\
\ \"acc_norm\": 0.4274744027303754,\n \"acc_norm_stderr\": 0.014456862944650652\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5203146783509262,\n\
\ \"acc_stderr\": 0.004985661282998582,\n \"acc_norm\": 0.7114120693089027,\n\
\ \"acc_norm_stderr\": 0.004521798577922143\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296559,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296559\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.030533338430467512,\n\
\ \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.030533338430467512\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.03733626655383509,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.03733626655383509\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596241,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596241\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325642,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4838709677419355,\n \"acc_stderr\": 0.028429203176724555,\n \"\
acc_norm\": 0.4838709677419355,\n \"acc_norm_stderr\": 0.028429203176724555\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"\
acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n\
\ \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5050505050505051,\n \"acc_stderr\": 0.035621707606254015,\n \"\
acc_norm\": 0.5050505050505051,\n \"acc_norm_stderr\": 0.035621707606254015\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5181347150259067,\n \"acc_stderr\": 0.036060650018329185,\n\
\ \"acc_norm\": 0.5181347150259067,\n \"acc_norm_stderr\": 0.036060650018329185\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.39487179487179486,\n \"acc_stderr\": 0.02478431694215638,\n\
\ \"acc_norm\": 0.39487179487179486,\n \"acc_norm_stderr\": 0.02478431694215638\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.031811100324139245,\n\
\ \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.031811100324139245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119994,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119994\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5321100917431193,\n \"acc_stderr\": 0.021393071222680797,\n \"\
acc_norm\": 0.5321100917431193,\n \"acc_norm_stderr\": 0.021393071222680797\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.03070137211151094,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.03070137211151094\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5882352941176471,\n \"acc_stderr\": 0.034542365853806094,\n \"\
acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.034542365853806094\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5569620253164557,\n \"acc_stderr\": 0.032335327775334835,\n \
\ \"acc_norm\": 0.5569620253164557,\n \"acc_norm_stderr\": 0.032335327775334835\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.42152466367713004,\n\
\ \"acc_stderr\": 0.03314190222110657,\n \"acc_norm\": 0.42152466367713004,\n\
\ \"acc_norm_stderr\": 0.03314190222110657\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5289256198347108,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.5289256198347108,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.588957055214724,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.588957055214724,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\
\ \"acc_stderr\": 0.03150712523091264,\n \"acc_norm\": 0.6367521367521367,\n\
\ \"acc_norm_stderr\": 0.03150712523091264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5466155810983397,\n\
\ \"acc_stderr\": 0.017802087135850304,\n \"acc_norm\": 0.5466155810983397,\n\
\ \"acc_norm_stderr\": 0.017802087135850304\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4653179190751445,\n \"acc_stderr\": 0.0268542579282589,\n\
\ \"acc_norm\": 0.4653179190751445,\n \"acc_norm_stderr\": 0.0268542579282589\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925296,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925296\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4758842443729904,\n\
\ \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.4758842443729904,\n\
\ \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4660493827160494,\n \"acc_stderr\": 0.027756535257347666,\n\
\ \"acc_norm\": 0.4660493827160494,\n \"acc_norm_stderr\": 0.027756535257347666\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759422,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759422\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3344198174706649,\n\
\ \"acc_stderr\": 0.012049668983214933,\n \"acc_norm\": 0.3344198174706649,\n\
\ \"acc_norm_stderr\": 0.012049668983214933\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280058,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280058\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.38562091503267976,\n \"acc_stderr\": 0.01969145905235415,\n \
\ \"acc_norm\": 0.38562091503267976,\n \"acc_norm_stderr\": 0.01969145905235415\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.03141470802586589,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.03141470802586589\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.572139303482587,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.572139303482587,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.03762738699917057,\n\
\ \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.03762738699917057\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.378505315070287,\n\
\ \"mc2_stderr\": 0.013586954257578736\n }\n}\n```"
repo_url: https://huggingface.co/adept/persimmon-8b-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-30-00.730198.parquet'
- config_name: results
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- results_2023-10-11T16-30-00.730198.parquet
- split: latest
path:
- results_2023-10-11T16-30-00.730198.parquet
---
# Dataset Card for Evaluation run of adept/persimmon-8b-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/adept/persimmon-8b-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [adept/persimmon-8b-base](https://huggingface.co/adept/persimmon-8b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adept__persimmon-8b-base",
"harness_truthfulqa_mc_0",
split="train")
```
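Each timestamped split name is the run timestamp with `-` and `:` replaced by `_`; this convention is inferred from the parquet file names listed above, so treat it as an assumption. A minimal sketch of mapping a split name back to a datetime:

```python
from datetime import datetime

# Split names such as "2023_10_11T16_30_00.730198" encode the run timestamp
# with "-" and ":" replaced by "_" (assumption inferred from the file names).
split_name = "2023_10_11T16_30_00.730198"
run_time = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_time.isoformat())  # → 2023-10-11T16:30:00.730198
```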
## Latest results
These are the [latest results from run 2023-10-11T16:30:00.730198](https://huggingface.co/datasets/open-llm-leaderboard/details_adept__persimmon-8b-base/blob/main/results_2023-10-11T16-30-00.730198.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4373382174928584,
"acc_stderr": 0.03537473296886481,
"acc_norm": 0.440779620602171,
"acc_norm_stderr": 0.03536781150443019,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.378505315070287,
"mc2_stderr": 0.013586954257578736
},
"harness|arc:challenge|25": {
"acc": 0.41552901023890787,
"acc_stderr": 0.014401366641216384,
"acc_norm": 0.4274744027303754,
"acc_norm_stderr": 0.014456862944650652
},
"harness|hellaswag|10": {
"acc": 0.5203146783509262,
"acc_stderr": 0.004985661282998582,
"acc_norm": 0.7114120693089027,
"acc_norm_stderr": 0.004521798577922143
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296559,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296559
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4377358490566038,
"acc_stderr": 0.030533338430467512,
"acc_norm": 0.4377358490566038,
"acc_norm_stderr": 0.030533338430467512
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.03733626655383509,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.03733626655383509
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596241,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596241
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325642,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4838709677419355,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.4838709677419355,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5050505050505051,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.5050505050505051,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5181347150259067,
"acc_stderr": 0.036060650018329185,
"acc_norm": 0.5181347150259067,
"acc_norm_stderr": 0.036060650018329185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.39487179487179486,
"acc_stderr": 0.02478431694215638,
"acc_norm": 0.39487179487179486,
"acc_norm_stderr": 0.02478431694215638
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119994,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119994
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5321100917431193,
"acc_stderr": 0.021393071222680797,
"acc_norm": 0.5321100917431193,
"acc_norm_stderr": 0.021393071222680797
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.03070137211151094,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.03070137211151094
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.034542365853806094,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.034542365853806094
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5569620253164557,
"acc_stderr": 0.032335327775334835,
"acc_norm": 0.5569620253164557,
"acc_norm_stderr": 0.032335327775334835
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.42152466367713004,
"acc_stderr": 0.03314190222110657,
"acc_norm": 0.42152466367713004,
"acc_norm_stderr": 0.03314190222110657
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5289256198347108,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.5289256198347108,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.588957055214724,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.588957055214724,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091264,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091264
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5466155810983397,
"acc_stderr": 0.017802087135850304,
"acc_norm": 0.5466155810983397,
"acc_norm_stderr": 0.017802087135850304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4653179190751445,
"acc_stderr": 0.0268542579282589,
"acc_norm": 0.4653179190751445,
"acc_norm_stderr": 0.0268542579282589
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925296,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925296
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4758842443729904,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.4758842443729904,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4660493827160494,
"acc_stderr": 0.027756535257347666,
"acc_norm": 0.4660493827160494,
"acc_norm_stderr": 0.027756535257347666
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3344198174706649,
"acc_stderr": 0.012049668983214933,
"acc_norm": 0.3344198174706649,
"acc_norm_stderr": 0.012049668983214933
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280058,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280058
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.38562091503267976,
"acc_stderr": 0.01969145905235415,
"acc_norm": 0.38562091503267976,
"acc_norm_stderr": 0.01969145905235415
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794917,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794917
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.03141470802586589,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.03141470802586589
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.572139303482587,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.572139303482587,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.03762738699917057,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.03762738699917057
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.378505315070287,
"mc2_stderr": 0.013586954257578736
}
}
```
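The top-level "all" block aggregates the per-task metrics; treating it as an unweighted mean over tasks is an assumption here, shown on a small illustrative subset of the accuracies above:

```python
from statistics import mean

# A few per-task accuracies excerpted from the results above (subset for illustration).
task_acc = {
    "harness|arc:challenge|25": 0.41552901023890787,
    "harness|hellaswag|10": 0.5203146783509262,
    "harness|hendrycksTest-abstract_algebra|5": 0.27,
    "harness|hendrycksTest-anatomy|5": 0.45925925925925926,
}

# Assumed aggregation: unweighted mean across the selected tasks.
overall = mean(task_acc.values())
print(round(overall, 4))  # → 0.4163
```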
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_adept__persimmon-8b-chat | 2023-10-11T16:43:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T16:42:44 | ---
pretty_name: Evaluation run of adept/persimmon-8b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adept/persimmon-8b-chat](https://huggingface.co/adept/persimmon-8b-chat) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adept__persimmon-8b-chat\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T16:42:26.599502](https://huggingface.co/datasets/open-llm-leaderboard/details_adept__persimmon-8b-chat/blob/main/results_2023-10-11T16-42-26.599502.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.451177873431513,\n\
\ \"acc_stderr\": 0.035194823330966046,\n \"acc_norm\": 0.45457780958443955,\n\
\ \"acc_norm_stderr\": 0.03518601630486001,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396736,\n \"mc2\": 0.35928086491565836,\n\
\ \"mc2_stderr\": 0.013539847732342817\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.43856655290102387,\n \"acc_stderr\": 0.014500682618212864,\n\
\ \"acc_norm\": 0.4496587030716723,\n \"acc_norm_stderr\": 0.014537144444284738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5435172276438957,\n\
\ \"acc_stderr\": 0.0049708466975523094,\n \"acc_norm\": 0.7330213104959171,\n\
\ \"acc_norm_stderr\": 0.004414770331224652\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500482,\n\
\ \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500482\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.037657466938651504,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.037657466938651504\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.28835978835978837,\n \"acc_stderr\": 0.0233306540545359,\n \"\
acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.0233306540545359\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5225806451612903,\n \"acc_stderr\": 0.028414985019707868,\n \"\
acc_norm\": 0.5225806451612903,\n \"acc_norm_stderr\": 0.028414985019707868\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n \"\
acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5202020202020202,\n \"acc_stderr\": 0.03559443565563917,\n \"\
acc_norm\": 0.5202020202020202,\n \"acc_norm_stderr\": 0.03559443565563917\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5854922279792746,\n \"acc_stderr\": 0.035553003195576686,\n\
\ \"acc_norm\": 0.5854922279792746,\n \"acc_norm_stderr\": 0.035553003195576686\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4358974358974359,\n \"acc_stderr\": 0.02514180151117749,\n \
\ \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.02514180151117749\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804726,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804726\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5596330275229358,\n \"acc_stderr\": 0.02128431062376155,\n \"\
acc_norm\": 0.5596330275229358,\n \"acc_norm_stderr\": 0.02128431062376155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.030546745264953174,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.030546745264953174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.553921568627451,\n \"acc_stderr\": 0.034888454513049734,\n \"\
acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.034888454513049734\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.031450686007448596,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.031450686007448596\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.033378837362550984,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.033378837362550984\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.048129173245368216,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.048129173245368216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.048467482539772386,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.048467482539772386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5747126436781609,\n\
\ \"acc_stderr\": 0.01767922548943145,\n \"acc_norm\": 0.5747126436781609,\n\
\ \"acc_norm_stderr\": 0.01767922548943145\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.026897049996382868,\n\
\ \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.026897049996382868\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4477124183006536,\n \"acc_stderr\": 0.028472938478033522,\n\
\ \"acc_norm\": 0.4477124183006536,\n \"acc_norm_stderr\": 0.028472938478033522\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4790996784565916,\n\
\ \"acc_stderr\": 0.028373270961069414,\n \"acc_norm\": 0.4790996784565916,\n\
\ \"acc_norm_stderr\": 0.028373270961069414\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.45987654320987653,\n \"acc_stderr\": 0.027731022753539277,\n\
\ \"acc_norm\": 0.45987654320987653,\n \"acc_norm_stderr\": 0.027731022753539277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32978723404255317,\n \"acc_stderr\": 0.028045946942042405,\n \
\ \"acc_norm\": 0.32978723404255317,\n \"acc_norm_stderr\": 0.028045946942042405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3546284224250326,\n\
\ \"acc_stderr\": 0.01221857643909016,\n \"acc_norm\": 0.3546284224250326,\n\
\ \"acc_norm_stderr\": 0.01221857643909016\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33455882352941174,\n \"acc_stderr\": 0.028661996202335307,\n\
\ \"acc_norm\": 0.33455882352941174,\n \"acc_norm_stderr\": 0.028661996202335307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.434640522875817,\n \"acc_stderr\": 0.020054269200726452,\n \
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.020054269200726452\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065685,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065685\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.03428867848778658,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.03428867848778658\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6023391812865497,\n \"acc_stderr\": 0.03753638955761691,\n\
\ \"acc_norm\": 0.6023391812865497,\n \"acc_norm_stderr\": 0.03753638955761691\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396736,\n \"mc2\": 0.35928086491565836,\n\
\ \"mc2_stderr\": 0.013539847732342817\n }\n}\n```"
repo_url: https://huggingface.co/adept/persimmon-8b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-42-26.599502.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-42-26.599502.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-42-26.599502.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-42-26.599502.parquet'
- config_name: results
data_files:
- split: 2023_10_11T16_42_26.599502
path:
- results_2023-10-11T16-42-26.599502.parquet
- split: latest
path:
- results_2023-10-11T16-42-26.599502.parquet
---
# Dataset Card for Evaluation run of adept/persimmon-8b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/adept/persimmon-8b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [adept/persimmon-8b-chat](https://huggingface.co/adept/persimmon-8b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
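As the configuration list above shows, each config name is a flattened form of the corresponding harness task string (pipes, colons, and hyphens become underscores). A minimal sketch of that mapping, inferred from the config names in this card rather than from any official API:

```python
# Sketch: derive a leaderboard config name from a harness task string, e.g.
# "harness|hendrycksTest-marketing|5" -> "harness_hendrycksTest_marketing_5".
# This rule is inferred from the names in this card, not an official API.

def task_to_config_name(task: str) -> str:
    """Flatten a harness task string into a leaderboard config name."""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config_name("harness|hendrycksTest-marketing|5"))  # harness_hendrycksTest_marketing_5
print(task_to_config_name("harness|truthfulqa:mc|0"))            # harness_truthfulqa_mc_0
```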
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adept__persimmon-8b-chat",
"harness_truthfulqa_mc_0",
split="train")
```
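Since each run's split is named after its timestamp, the split and file names can be derived from the ISO timestamp. A hedged sketch of that mapping, based on the split and parquet names visible in this card (not an official API):

```python
# Assumption (observed in this card): a run timestamp like
# "2023-10-11T16:42:26.599502" appears as the split name
# "2023_10_11T16_42_26.599502" and, inside parquet/result file names,
# as "2023-10-11T16-42-26.599502".

def timestamp_to_split(ts: str) -> str:
    """Derive the per-run split name from a run timestamp."""
    return ts.replace("-", "_").replace(":", "_")

def timestamp_to_filename_part(ts: str) -> str:
    """Derive the timestamp fragment used in result file names."""
    return ts.replace(":", "-")

print(timestamp_to_split("2023-10-11T16:42:26.599502"))          # 2023_10_11T16_42_26.599502
print(timestamp_to_filename_part("2023-10-11T16:42:26.599502"))  # 2023-10-11T16-42-26.599502
```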
## Latest results
These are the [latest results from run 2023-10-11T16:42:26.599502](https://huggingface.co/datasets/open-llm-leaderboard/details_adept__persimmon-8b-chat/blob/main/results_2023-10-11T16-42-26.599502.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.451177873431513,
"acc_stderr": 0.035194823330966046,
"acc_norm": 0.45457780958443955,
"acc_norm_stderr": 0.03518601630486001,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396736,
"mc2": 0.35928086491565836,
"mc2_stderr": 0.013539847732342817
},
"harness|arc:challenge|25": {
"acc": 0.43856655290102387,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.4496587030716723,
"acc_norm_stderr": 0.014537144444284738
},
"harness|hellaswag|10": {
"acc": 0.5435172276438957,
"acc_stderr": 0.0049708466975523094,
"acc_norm": 0.7330213104959171,
"acc_norm_stderr": 0.004414770331224652
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5132075471698113,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.5132075471698113,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.037657466938651504,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.037657466938651504
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.0233306540545359,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.0233306540545359
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.028414985019707868,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.028414985019707868
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5202020202020202,
"acc_stderr": 0.03559443565563917,
"acc_norm": 0.5202020202020202,
"acc_norm_stderr": 0.03559443565563917
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5854922279792746,
"acc_stderr": 0.035553003195576686,
"acc_norm": 0.5854922279792746,
"acc_norm_stderr": 0.035553003195576686
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4358974358974359,
"acc_stderr": 0.02514180151117749,
"acc_norm": 0.4358974358974359,
"acc_norm_stderr": 0.02514180151117749
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804726,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804726
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5596330275229358,
"acc_stderr": 0.02128431062376155,
"acc_norm": 0.5596330275229358,
"acc_norm_stderr": 0.02128431062376155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.030546745264953174,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.030546745264953174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.034888454513049734,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.034888454513049734
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.031450686007448596,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.031450686007448596
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.033378837362550984,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.033378837362550984
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.048129173245368216,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.048129173245368216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5747126436781609,
"acc_stderr": 0.01767922548943145,
"acc_norm": 0.5747126436781609,
"acc_norm_stderr": 0.01767922548943145
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.026897049996382868,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.026897049996382868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925293,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925293
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4477124183006536,
"acc_stderr": 0.028472938478033522,
"acc_norm": 0.4477124183006536,
"acc_norm_stderr": 0.028472938478033522
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4790996784565916,
"acc_stderr": 0.028373270961069414,
"acc_norm": 0.4790996784565916,
"acc_norm_stderr": 0.028373270961069414
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.45987654320987653,
"acc_stderr": 0.027731022753539277,
"acc_norm": 0.45987654320987653,
"acc_norm_stderr": 0.027731022753539277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32978723404255317,
"acc_stderr": 0.028045946942042405,
"acc_norm": 0.32978723404255317,
"acc_norm_stderr": 0.028045946942042405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3546284224250326,
"acc_stderr": 0.01221857643909016,
"acc_norm": 0.3546284224250326,
"acc_norm_stderr": 0.01221857643909016
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33455882352941174,
"acc_stderr": 0.028661996202335307,
"acc_norm": 0.33455882352941174,
"acc_norm_stderr": 0.028661996202335307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.020054269200726452,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.020054269200726452
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065685,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065685
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.03428867848778658,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.03428867848778658
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6023391812865497,
"acc_stderr": 0.03753638955761691,
"acc_norm": 0.6023391812865497,
"acc_norm_stderr": 0.03753638955761691
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396736,
"mc2": 0.35928086491565836,
"mc2_stderr": 0.013539847732342817
}
}
```
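The `"all"` block above aggregates the per-task metrics. As an illustration, a macro-average over per-task `acc` values from a dict of this shape can be computed as below; this is a sketch over a small subset of the tasks, and the leaderboard's own aggregation may weight tasks differently:

```python
# Hedged sketch: macro-average the per-task "acc" values from a results dict
# shaped like the JSON above. The three task entries are a small illustrative
# subset copied from this card, not the full 61-task run.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.6752136752136753},
    "harness|hendrycksTest-virology|5": {"acc": 0.40963855421686746},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.6023391812865497},
}

per_task_acc = [v["acc"] for v in results.values()]
macro_acc = sum(per_task_acc) / len(per_task_acc)
print(f"macro acc over {len(per_task_acc)} tasks: {macro_acc:.4f}")  # ~0.5624
```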
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,967 | [
[
-0.048828125,
-0.05889892578125,
0.0192718505859375,
0.01486968994140625,
-0.010467529296875,
-0.0050811767578125,
0.0016841888427734375,
-0.01200103759765625,
0.038055419921875,
-0.0021381378173828125,
-0.03350830078125,
-0.050048828125,
-0.031494140625,
0.... |
Rutson/GirlsGeneration | 2023-10-11T17:06:00.000Z | [
"region:us"
] | Rutson | null | null | 0 | 0 | 2023-10-11T17:06:00 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.0350341796875,
0.046478271484375,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.0170135498046875,
-0.052093505859375,
-0.01497650146484375,
-0.0604248046875,
0.0379028... |
open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench9 | 2023-10-11T17:39:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T17:38:44 | ---
pretty_name: Evaluation run of Undi95/Mistral-11B-TestBench9
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Mistral-11B-TestBench9](https://huggingface.co/Undi95/Mistral-11B-TestBench9)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench9\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T17:38:21.379151](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench9/blob/main/results_2023-10-11T17-38-21.379151.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6398621215363215,\n\
\ \"acc_stderr\": 0.033170910986947626,\n \"acc_norm\": 0.6434278880715447,\n\
\ \"acc_norm_stderr\": 0.03314925860793652,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5618804562751369,\n\
\ \"mc2_stderr\": 0.015525700835296153\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n\
\ \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.01402022415583916\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.652459669388568,\n\
\ \"acc_stderr\": 0.004752158936871871,\n \"acc_norm\": 0.8423620792670783,\n\
\ \"acc_norm_stderr\": 0.0036365642863526765\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438655,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.023559646983189946,\n\
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.023559646983189946\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973138,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973138\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n\
\ \"acc_stderr\": 0.016449708209026078,\n \"acc_norm\": 0.4100558659217877,\n\
\ \"acc_norm_stderr\": 0.016449708209026078\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\
\ \"acc_stderr\": 0.012713845972358978,\n \"acc_norm\": 0.4530638852672751,\n\
\ \"acc_norm_stderr\": 0.012713845972358978\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587952,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587952\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5618804562751369,\n\
\ \"mc2_stderr\": 0.015525700835296153\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Mistral-11B-TestBench9
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-38-21.379151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-38-21.379151.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-38-21.379151.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-38-21.379151.parquet'
- config_name: results
data_files:
- split: 2023_10_11T17_38_21.379151
path:
- results_2023-10-11T17-38-21.379151.parquet
- split: latest
path:
- results_2023-10-11T17-38-21.379151.parquet
---
# Dataset Card for Evaluation run of Undi95/Mistral-11B-TestBench9
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Mistral-11B-TestBench9
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-TestBench9](https://huggingface.co/Undi95/Mistral-11B-TestBench9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench9",
"harness_truthfulqa_mc_0",
split="train")
```
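As noted above, each run is stored under a split named after its timestamp. Comparing the config entries with the parquet file names (e.g. split `2023_10_11T17_38_21.379151` vs. file timestamp `2023-10-11T17-38-21.379151`), the split name appears to simply replace each `-` with `_`. A minimal sketch of that mapping, assuming the convention holds for all runs:

```python
def timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp as it appears in the parquet file names
    to the corresponding split name used in the dataset configs."""
    # e.g. "2023-10-11T17-38-21.379151" -> "2023_10_11T17_38_21.379151"
    return ts.replace("-", "_")


print(timestamp_to_split_name("2023-10-11T17-38-21.379151"))
# -> 2023_10_11T17_38_21.379151
```

This can be handy when you only know a run's file timestamp and want to select its split with `load_dataset(..., split=...)`.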
## Latest results
These are the [latest results from run 2023-10-11T17:38:21.379151](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench9/blob/main/results_2023-10-11T17-38-21.379151.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6398621215363215,
"acc_stderr": 0.033170910986947626,
"acc_norm": 0.6434278880715447,
"acc_norm_stderr": 0.03314925860793652,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5618804562751369,
"mc2_stderr": 0.015525700835296153
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6407849829351536,
"acc_norm_stderr": 0.01402022415583916
},
"harness|hellaswag|10": {
"acc": 0.652459669388568,
"acc_stderr": 0.004752158936871871,
"acc_norm": 0.8423620792670783,
"acc_norm_stderr": 0.0036365642863526765
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438655,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.023559646983189946,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.023559646983189946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973138,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.016449708209026078,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.016449708209026078
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358978,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358978
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587952,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587952
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5618804562751369,
"mc2_stderr": 0.015525700835296153
}
}
```
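The top-level `"all"` block appears to aggregate the per-task metrics. A minimal sketch of that aggregation, under the assumption (not confirmed by this card) that each metric in `"all"` is a plain mean over every task reporting it, using two entries copied from the results above:

```python
# Per-task accuracies copied from two entries of the results above.
results = {
    "harness|arc:challenge|25": {"acc": 0.6203071672354948},
    "harness|hellaswag|10": {"acc": 0.652459669388568},
}

# Assumption: the "all" block averages each metric over the tasks that report it.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))
# -> 0.6364
```

The same pattern extends to `acc_norm`, `mc1`, and `mc2` over the tasks that report those metrics.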
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,923 | [
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080091 | 2023-10-24T19:37:00.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T17:39:29 |
---
pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080091
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080091](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080091)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080091\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T19:36:48.099081](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080091/blob/main/results_2023-10-24T19-36-48.099081.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004718959731543624,\n\
\ \"em_stderr\": 0.0007018360183131064,\n \"f1\": 0.06776950503355726,\n\
\ \"f1_stderr\": 0.00157312938548866,\n \"acc\": 0.4068737946340684,\n\
\ \"acc_stderr\": 0.00984774370435679\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004718959731543624,\n \"em_stderr\": 0.0007018360183131064,\n\
\ \"f1\": 0.06776950503355726,\n \"f1_stderr\": 0.00157312938548866\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07657316148597422,\n \
\ \"acc_stderr\": 0.007324564881451574\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.012370922527262008\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080091
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T19_36_48.099081
path:
- '**/details_harness|drop|3_2023-10-24T19-36-48.099081.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T19-36-48.099081.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T19_36_48.099081
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-36-48.099081.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-36-48.099081.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-39-05.539335.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-39-05.539335.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-39-05.539335.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T19_36_48.099081
path:
- '**/details_harness|winogrande|5_2023-10-24T19-36-48.099081.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T19-36-48.099081.parquet'
- config_name: results
data_files:
- split: 2023_10_11T17_39_05.539335
path:
- results_2023-10-11T17-39-05.539335.parquet
- split: 2023_10_24T19_36_48.099081
path:
- results_2023-10-24T19-36-48.099081.parquet
- split: latest
path:
- results_2023-10-24T19-36-48.099081.parquet
---
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080091
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080091
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080091](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080091) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080091",
"harness_winogrande_5",
split="train")
```
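As noted above, each run's split is named after the run timestamp, with the `-` and `:` of the ISO timestamp replaced by underscores (e.g. run `2023-10-24T19:36:48.099081` becomes split `2023_10_24T19_36_48.099081`). A minimal sketch of that naming convention — `timestamp_to_split` is an illustrative helper, not part of the `datasets` API:

```python
# Sketch of the split-naming convention used by these details datasets:
# the run timestamp's '-' and ':' characters become '_'; the '.' is kept.
# `timestamp_to_split` is an illustrative helper name, not a `datasets` API.
def timestamp_to_split(run_timestamp: str) -> str:
    return run_timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-24T19:36:48.099081"))
# -> 2023_10_24T19_36_48.099081
```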
## Latest results
These are the [latest results from run 2023-10-24T19:36:48.099081](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080091/blob/main/results_2023-10-24T19-36-48.099081.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131064,
"f1": 0.06776950503355726,
"f1_stderr": 0.00157312938548866,
"acc": 0.4068737946340684,
"acc_stderr": 0.00984774370435679
},
"harness|drop|3": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131064,
"f1": 0.06776950503355726,
"f1_stderr": 0.00157312938548866
},
"harness|gsm8k|5": {
"acc": 0.07657316148597422,
"acc_stderr": 0.007324564881451574
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.012370922527262008
}
}
```
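The nested results dictionary above maps each task to its metrics; flattening it into `"task/metric"` keys makes quick lookups easier. A small sketch — the dictionary literal is copied from the snippet above (stderr fields omitted for brevity), and `flatten_results` is an illustrative helper, not part of any library:

```python
# Flatten the nested results dict shown above into "task/metric" -> value
# pairs for quick inspection. `flatten_results` is an illustrative helper.
latest_results = {
    # "all" aggregates across tasks; stderr fields omitted for brevity.
    "all": {"em": 0.004718959731543624, "f1": 0.06776950503355726,
            "acc": 0.4068737946340684},
    "harness|gsm8k|5": {"acc": 0.07657316148597422},
    "harness|winogrande|5": {"acc": 0.7371744277821626},
}

def flatten_results(results: dict) -> dict:
    return {f"{task}/{metric}": value
            for task, metrics in results.items()
            for metric, value in metrics.items()}

flat = flatten_results(latest_results)
print(round(flat["harness|winogrande|5/acc"], 4))  # -> 0.7372
```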
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,926 | [
[
-0.0267486572265625,
-0.05413818359375,
0.0171661376953125,
0.0201263427734375,
-0.0148162841796875,
0.0034809112548828125,
-0.02276611328125,
-0.0166778564453125,
0.034881591796875,
0.044830322265625,
-0.048004150390625,
-0.0694580078125,
-0.0426025390625,
... |
open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080082 | 2023-10-23T18:02:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T17:45:48 | ---
pretty_name: Evaluation run of Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080082
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080082](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080082)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080082\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T18:02:30.843384](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080082/blob/main/results_2023-10-23T18-02-30.843384.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004718959731543624,\n\
\ \"em_stderr\": 0.0007018360183131064,\n \"f1\": 0.06889890939597328,\n\
\ \"f1_stderr\": 0.0015900969200350048,\n \"acc\": 0.4076319447477909,\n\
\ \"acc_stderr\": 0.009880788504185114\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004718959731543624,\n \"em_stderr\": 0.0007018360183131064,\n\
\ \"f1\": 0.06889890939597328,\n \"f1_stderr\": 0.0015900969200350048\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \
\ \"acc_stderr\": 0.007390654481108218\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.012370922527262008\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080082
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T18_02_30.843384
path:
- '**/details_harness|drop|3_2023-10-23T18-02-30.843384.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T18-02-30.843384.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T18_02_30.843384
path:
- '**/details_harness|gsm8k|5_2023-10-23T18-02-30.843384.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T18-02-30.843384.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-45-25.017539.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-45-25.017539.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-45-25.017539.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T18_02_30.843384
path:
- '**/details_harness|winogrande|5_2023-10-23T18-02-30.843384.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T18-02-30.843384.parquet'
- config_name: results
data_files:
- split: 2023_10_11T17_45_25.017539
path:
- results_2023-10-11T17-45-25.017539.parquet
- split: 2023_10_23T18_02_30.843384
path:
- results_2023-10-23T18-02-30.843384.parquet
- split: latest
path:
- results_2023-10-23T18-02-30.843384.parquet
---
# Dataset Card for Evaluation run of Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080082
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080082
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080082](https://huggingface.co/Charlie911/vicuna-7b-v1.5-lora-timedial-unit-080082) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080082",
"harness_winogrande_5",
	split="latest")
```
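The parquet files referenced by each configuration's `data_files` globs above all follow a single naming convention. A small illustrative sketch (the helper function below is ours, not part of the `datasets` library):

```python
# Illustrative helper (not an official API): builds the glob pattern used
# throughout this card's data_files sections, i.e.
# "**/details_harness|<task>|<n_shot>_<timestamp>.parquet".
def details_glob(task: str, n_shot: int, timestamp: str) -> str:
    return f"**/details_harness|{task}|{n_shot}_{timestamp}.parquet"

# Reproduces the winogrande entry from the YAML above.
pattern = details_glob("winogrande", 5, "2023-10-23T18-02-30.843384")
print(pattern)  # **/details_harness|winogrande|5_2023-10-23T18-02-30.843384.parquet
```

The same pattern covers the MMLU subtasks (`hendrycksTest-<subject>`) and `truthfulqa:mc`; only the `results` configuration uses a different, prefix-free filename.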
## Latest results
These are the [latest results from run 2023-10-23T18:02:30.843384](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__vicuna-7b-v1.5-lora-timedial-unit-080082/blob/main/results_2023-10-23T18-02-30.843384.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131064,
"f1": 0.06889890939597328,
"f1_stderr": 0.0015900969200350048,
"acc": 0.4076319447477909,
"acc_stderr": 0.009880788504185114
},
"harness|drop|3": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131064,
"f1": 0.06889890939597328,
"f1_stderr": 0.0015900969200350048
},
"harness|gsm8k|5": {
"acc": 0.07808946171341925,
"acc_stderr": 0.007390654481108218
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.012370922527262008
}
}
```
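As a quick sanity check on the numbers above: the `"all"` block appears to be the unweighted mean of the per-task metrics, with `em`/`f1` coming from `drop` alone and `acc` averaged over `gsm8k` and `winogrande` (this is our reading of these results, not an official aggregation API):

```python
# Per-task metrics copied verbatim from the latest results above.
per_task = {
    "harness|drop|3": {"em": 0.004718959731543624, "f1": 0.06889890939597328},
    "harness|gsm8k|5": {"acc": 0.07808946171341925},
    "harness|winogrande|5": {"acc": 0.7371744277821626},
}

# "all" averages each metric over the tasks that report it.
accs = [m["acc"] for m in per_task.values() if "acc" in m]
agg_acc = sum(accs) / len(accs)
# agg_acc reproduces the reported "acc" of 0.4076319447477909 (up to float rounding).
```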
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13.1 | 2023-10-27T07:59:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T17:55:36 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-mistral-7b-v13.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-mistral-7b-v13.1](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T07:58:51.376578](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13.1/blob/main/results_2023-10-27T07-58-51.376578.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.28953439597315433,\n\
\ \"em_stderr\": 0.004644738434561709,\n \"f1\": 0.3589429530201351,\n\
\ \"f1_stderr\": 0.004562237952673667,\n \"acc\": 0.401525754664538,\n\
\ \"acc_stderr\": 0.010223042101778138\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.28953439597315433,\n \"em_stderr\": 0.004644738434561709,\n\
\ \"f1\": 0.3589429530201351,\n \"f1_stderr\": 0.004562237952673667\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08718726307808947,\n \
\ \"acc_stderr\": 0.007770691416783547\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.01267539278677273\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T07_58_51.376578
path:
- '**/details_harness|drop|3_2023-10-27T07-58-51.376578.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T07-58-51.376578.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T07_58_51.376578
path:
- '**/details_harness|gsm8k|5_2023-10-27T07-58-51.376578.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T07-58-51.376578.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-55-12.784881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-55-12.784881.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-55-12.784881.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T07_58_51.376578
path:
- '**/details_harness|winogrande|5_2023-10-27T07-58-51.376578.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T07-58-51.376578.parquet'
- config_name: results
data_files:
- split: 2023_10_11T17_55_12.784881
path:
- results_2023-10-11T17-55-12.784881.parquet
- split: 2023_10_27T07_58_51.376578
path:
- results_2023-10-27T07-58-51.376578.parquet
- split: latest
path:
- results_2023-10-27T07-58-51.376578.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v13.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mistral-7b-v13.1](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13.1",
"harness_winogrande_5",
	split="latest")
```
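Each per-task configuration exposes one split per run, named with the run's timestamp, plus a `latest` alias. Because the timestamp format (`YYYY_MM_DDTHH_MM_SS.ffffff`) is zero-padded, lexicographic order equals chronological order, so the most recent run can be resolved without parsing dates. A minimal sketch (the split names below are taken from this card's config list):

```python
def latest_run_split(split_names):
    """Pick the most recent timestamped split, ignoring the "latest" alias.

    Split names like "2023_10_27T07_58_51.376578" are zero-padded, so
    plain string comparison sorts them chronologically and max() suffices.
    """
    timestamped = [name for name in split_names if name != "latest"]
    return max(timestamped)

# Splits of the "results" configuration shown in this card's metadata.
splits = ["2023_10_11T17_55_12.784881", "2023_10_27T07_58_51.376578", "latest"]
print(latest_run_split(splits))  # → 2023_10_27T07_58_51.376578
```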
## Latest results
These are the [latest results from run 2023-10-27T07:58:51.376578](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13.1/blob/main/results_2023-10-27T07-58-51.376578.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.28953439597315433,
"em_stderr": 0.004644738434561709,
"f1": 0.3589429530201351,
"f1_stderr": 0.004562237952673667,
"acc": 0.401525754664538,
"acc_stderr": 0.010223042101778138
},
"harness|drop|3": {
"em": 0.28953439597315433,
"em_stderr": 0.004644738434561709,
"f1": 0.3589429530201351,
"f1_stderr": 0.004562237952673667
},
"harness|gsm8k|5": {
"acc": 0.08718726307808947,
"acc_stderr": 0.007770691416783547
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.01267539278677273
}
}
```
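As a sanity check on the numbers above, the overall `acc` under `"all"` is the unweighted mean of the per-task accuracies (GSM8K and Winogrande in this run). A small sketch reproducing it from the values in the JSON:

```python
# Per-task accuracies copied from the latest-results JSON above.
task_acc = {
    "harness|gsm8k|5": 0.08718726307808947,
    "harness|winogrande|5": 0.7158642462509865,
}

# "all" reports the unweighted mean across the tasks present in the run.
mean_acc = sum(task_acc.values()) / len(task_acc)
assert abs(mean_acc - 0.401525754664538) < 1e-12  # the "acc" value under "all"
print(mean_acc)
```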
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,736 | [
[
-0.0296630859375,
-0.05279541015625,
0.0085906982421875,
0.0172119140625,
-0.0088653564453125,
-0.001323699951171875,
-0.026123046875,
-0.011444091796875,
0.019775390625,
0.038055419921875,
-0.039825439453125,
-0.06695556640625,
-0.04473876953125,
0.00756454... |
csupiisc/plmn2.5l | 2023-10-11T17:56:26.000Z | [
"region:us"
] | csupiisc | null | null | 0 | 0 | 2023-10-11T17:56:24 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 753808
num_examples: 10000
download_size: 299024
dataset_size: 753808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "plmn2.5l"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 434 | [
[
-0.05029296875,
0.0014753341674804688,
0.01044464111328125,
0.02655029296875,
-0.0295562744140625,
-0.00878143310546875,
0.0263824462890625,
-0.001861572265625,
0.0285797119140625,
0.04693603515625,
-0.058685302734375,
-0.0533447265625,
-0.037872314453125,
-... |
csupiisc/plmn3.5l | 2023-10-11T17:56:28.000Z | [
"region:us"
] | csupiisc | null | null | 0 | 0 | 2023-10-11T17:56:26 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 754449
num_examples: 10000
download_size: 300127
dataset_size: 754449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "plmn3.5l"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 434 | [
[
-0.057525634765625,
0.0016574859619140625,
0.0197601318359375,
0.03131103515625,
-0.0254364013671875,
-0.00913238525390625,
0.0296478271484375,
-0.0027866363525390625,
0.0323486328125,
0.0506591796875,
-0.05926513671875,
-0.0633544921875,
-0.032684326171875,
... |
open-llm-leaderboard/details_jphme__em_german_leo_mistral | 2023-10-26T05:36:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T17:57:58 | ---
pretty_name: Evaluation run of jphme/em_german_leo_mistral
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jphme/em_german_leo_mistral](https://huggingface.co/jphme/em_german_leo_mistral)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jphme__em_german_leo_mistral\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T05:35:49.227572](https://huggingface.co/datasets/open-llm-leaderboard/details_jphme__em_german_leo_mistral/blob/main/results_2023-10-26T05-35-49.227572.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2305998322147651,\n\
\ \"em_stderr\": 0.004313653760724557,\n \"f1\": 0.2864733640939601,\n\
\ \"f1_stderr\": 0.004317447810452205,\n \"acc\": 0.3954548691248602,\n\
\ \"acc_stderr\": 0.009372608948757369\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2305998322147651,\n \"em_stderr\": 0.004313653760724557,\n\
\ \"f1\": 0.2864733640939601,\n \"f1_stderr\": 0.004317447810452205\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.056103108415466264,\n \
\ \"acc_stderr\": 0.00633866843132188\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.012406549466192858\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jphme/em_german_leo_mistral
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T05_35_49.227572
path:
- '**/details_harness|drop|3_2023-10-26T05-35-49.227572.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T05-35-49.227572.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T05_35_49.227572
path:
- '**/details_harness|gsm8k|5_2023-10-26T05-35-49.227572.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T05-35-49.227572.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-57-34.404631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-57-34.404631.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T17-57-34.404631.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T05_35_49.227572
path:
- '**/details_harness|winogrande|5_2023-10-26T05-35-49.227572.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T05-35-49.227572.parquet'
- config_name: results
data_files:
- split: 2023_10_11T17_57_34.404631
path:
- results_2023-10-11T17-57-34.404631.parquet
- split: 2023_10_26T05_35_49.227572
path:
- results_2023-10-26T05-35-49.227572.parquet
- split: latest
path:
- results_2023-10-26T05-35-49.227572.parquet
---
# Dataset Card for Evaluation run of jphme/em_german_leo_mistral
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jphme/em_german_leo_mistral
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jphme/em_german_leo_mistral](https://huggingface.co/jphme/em_german_leo_mistral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jphme__em_german_leo_mistral",
"harness_winogrande_5",
split="train")
```
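As the configs above show, each timestamped split name is just the run timestamp with hyphens and colons turned into underscores (e.g. run `2023-10-26T05:35:49.227572` is the split `2023_10_26T05_35_49.227572`). A minimal helper sketching that mapping, inferred from the naming pattern in this card rather than from any documented API:

```python
def timestamp_to_split(ts: str) -> str:
    # Hyphens in the date and colons in the time both become underscores;
    # the "T" separator and the fractional seconds are kept as-is.
    # (Mapping inferred from the split names listed in this card.)
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-26T05:35:49.227572"))
# → 2023_10_26T05_35_49.227572
```

Passing the result as `split=` to `load_dataset` selects that specific run instead of the `latest` split.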
## Latest results
These are the [latest results from run 2023-10-26T05:35:49.227572](https://huggingface.co/datasets/open-llm-leaderboard/details_jphme__em_german_leo_mistral/blob/main/results_2023-10-26T05-35-49.227572.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2305998322147651,
"em_stderr": 0.004313653760724557,
"f1": 0.2864733640939601,
"f1_stderr": 0.004317447810452205,
"acc": 0.3954548691248602,
"acc_stderr": 0.009372608948757369
},
"harness|drop|3": {
"em": 0.2305998322147651,
"em_stderr": 0.004313653760724557,
"f1": 0.2864733640939601,
"f1_stderr": 0.004317447810452205
},
"harness|gsm8k|5": {
"acc": 0.056103108415466264,
"acc_stderr": 0.00633866843132188
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.012406549466192858
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,628 | [
[
-0.03314208984375,
-0.049591064453125,
0.01262664794921875,
0.015777587890625,
-0.01096343994140625,
0.001644134521484375,
-0.0304412841796875,
-0.0139312744140625,
0.0296630859375,
0.037445068359375,
-0.051971435546875,
-0.0706787109375,
-0.05059814453125,
... |
Kyle1668/BOSS-Robustness-Benchmark | 2023-10-11T18:34:16.000Z | [
"region:us"
] | Kyle1668 | null | null | 0 | 0 | 2023-10-11T18:17:05 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
dengandong/SportsQA_FineGym | 2023-10-11T18:52:59.000Z | [
"region:us"
] | dengandong | null | null | 0 | 0 | 2023-10-11T18:29:56 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
mr-jurak1/yad | 2023-10-11T18:43:26.000Z | [
"region:us"
] | mr-jurak1 | null | null | 0 | 0 | 2023-10-11T18:43:26 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
tierdesafinante/jett | 2023-10-11T18:50:11.000Z | [
"region:us"
] | tierdesafinante | null | null | 0 | 0 | 2023-10-11T18:50:11 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.016998291015625,
-0.05206298828125,
-0.01496124267578125,
-0.06036376953125,
0.0379... |
ostapeno/platy_icl5_prmt00_maxD50_3 | 2023-10-11T19:09:33.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-11T19:09:21 | ---
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: docno
dtype: string
- name: subject
dtype: string
- name: icl_examples
sequence: string
- name: instruction
dtype: string
- name: author_instr
dtype: string
- name: response
dtype: string
- name: author_response
dtype: string
- name: normalized_cumul_logprob_response
dtype: float64
splits:
- name: formal_logic
num_bytes: 543627.0481283423
num_examples: 108
- name: machine_learning
num_bytes: 583895.7183600713
num_examples: 116
- name: global_facts
num_bytes: 734903.2317290553
num_examples: 146
- name: abstract_algebra
num_bytes: 372485.19964349375
num_examples: 74
- name: high_school_physics
num_bytes: 785239.0695187165
num_examples: 156
- name: college_biology
num_bytes: 488257.6265597148
num_examples: 97
- name: high_school_government_and_politics
num_bytes: 427854.6212121212
num_examples: 85
- name: prehistory
num_bytes: 568794.9670231729
num_examples: 113
- name: security_studies
num_bytes: 468123.29144385026
num_examples: 93
- name: sociology
num_bytes: 674500.2263814617
num_examples: 134
download_size: 2208464
dataset_size: 5647681.0
---
# Dataset Card for "platy_icl5_subset1.0_maxD50_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 1,502 | [
[
-0.0528564453125,
0.0087890625,
0.01219940185546875,
0.038238525390625,
-0.021087646484375,
0.0016193389892578125,
0.0234527587890625,
0.0029468536376953125,
0.038238525390625,
0.04931640625,
-0.0533447265625,
-0.06744384765625,
-0.039886474609375,
0.0044517... |
DangFutures/EUAI | 2023-10-11T20:13:07.000Z | [
"region:us"
] | DangFutures | null | null | 0 | 0 | 2023-10-11T19:39:59 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
Lexington120/test2 | 2023-10-11T19:40:24.000Z | [
"region:us"
] | Lexington120 | null | null | 0 | 0 | 2023-10-11T19:40:24 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.016998291015625,
-0.05206298828125,
-0.01496124267578125,
-0.06036376953125,
0.0379... |
open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11 | 2023-10-28T01:59:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T20:08:59 | ---
pretty_name: Evaluation run of Undi95/Mistral-11B-TestBench11
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Mistral-11B-TestBench11](https://huggingface.co/Undi95/Mistral-11B-TestBench11)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T01:59:23.177639](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11/blob/main/results_2023-10-28T01-59-23.177639.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02904781879194631,\n\
\ \"em_stderr\": 0.0017198688690203193,\n \"f1\": 0.09573615771812093,\n\
\ \"f1_stderr\": 0.0021674728464020697,\n \"acc\": 0.463391282649971,\n\
\ \"acc_stderr\": 0.010754512266719978\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.02904781879194631,\n \"em_stderr\": 0.0017198688690203193,\n\
\ \"f1\": 0.09573615771812093,\n \"f1_stderr\": 0.0021674728464020697\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14935557240333586,\n \
\ \"acc_stderr\": 0.00981809072372729\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/Mistral-11B-TestBench11
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|arc:challenge|25_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T01_59_23.177639
path:
- '**/details_harness|drop|3_2023-10-28T01-59-23.177639.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T01-59-23.177639.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T01_59_23.177639
path:
- '**/details_harness|gsm8k|5_2023-10-28T01-59-23.177639.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T01-59-23.177639.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hellaswag|10_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T01_59_23.177639
path:
- '**/details_harness|winogrande|5_2023-10-28T01-59-23.177639.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T01-59-23.177639.parquet'
- config_name: results
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- results_2023-10-11T20-08-34.702863.parquet
- split: 2023_10_28T01_59_23.177639
path:
- results_2023-10-28T01-59-23.177639.parquet
- split: latest
path:
- results_2023-10-28T01-59-23.177639.parquet
---
# Dataset Card for Evaluation run of Undi95/Mistral-11B-TestBench11
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Mistral-11B-TestBench11
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-TestBench11](https://huggingface.co/Undi95/Mistral-11B-TestBench11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
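The timestamped split names encode the run time with underscores in place of the `-` and `:` characters. As a small illustration (the `split_name_to_datetime` helper below is hypothetical, not part of this dataset's tooling), a split name can be mapped back to a `datetime`:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names such as "2023_10_28T01_59_23.177639" replace the "-" of the
    # date and the ":" of the time with "_"; undo that and parse as ISO 8601.
    date_part, time_part = split_name.split("T")
    return datetime.fromisoformat(
        f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    )

print(split_name_to_datetime("2023_10_28T01_59_23.177639"))
# → 2023-10-28 01:59:23.177639
```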
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T01:59:23.177639](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11/blob/main/results_2023-10-28T01-59-23.177639.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02904781879194631,
"em_stderr": 0.0017198688690203193,
"f1": 0.09573615771812093,
"f1_stderr": 0.0021674728464020697,
"acc": 0.463391282649971,
"acc_stderr": 0.010754512266719978
},
"harness|drop|3": {
"em": 0.02904781879194631,
"em_stderr": 0.0017198688690203193,
"f1": 0.09573615771812093,
"f1_stderr": 0.0021674728464020697
},
"harness|gsm8k|5": {
"acc": 0.14935557240333586,
"acc_stderr": 0.00981809072372729
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712667
}
}
```
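The top-level `"acc"` under `"all"` appears to be the unweighted mean of the accuracies of the acc-bearing tasks (gsm8k and winogrande here); a quick sanity check under that assumption:

```python
# Per-task accuracies copied from the results above; "all"."acc" looks like
# their unweighted mean (an assumption, not documented behaviour).
task_acc = {
    "harness|gsm8k|5": 0.14935557240333586,
    "harness|winogrande|5": 0.7774269928966061,
}
mean_acc = sum(task_acc.values()) / len(task_acc)
# mean_acc agrees with the reported 0.463391282649971 up to
# floating-point noise.
```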
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,676 | [truncated embedding vector] |
ostapeno/platy_icl2_subset1.0_maxD50_3 | 2023-10-11T20:26:48.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-11T20:26:35 | ## model_setting_name: platy
## max_context_length: 512
## subset: 1.0
## icl_examples: 2
## icl_dataset_name: lukaemon/mmlu
## max_documents_per_subject: 50
## icl_use_out_options: True
## seed_dataset: sordonia/my-wiki-latex_mmlu_from_valid_all
## subjects: SUB_10
 | 267 | [truncated embedding vector] |
ostapeno/platy_icl5_subset1.0_maxD50_3 | 2023-10-11T20:35:36.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-11T20:31:11 | ## model_setting_name: platy
## max_context_length: 512
## subset: 1.0
## icl_examples: 5
## icl_dataset_name: lukaemon/mmlu
## max_documents_per_subject: 50
## icl_use_out_options: True
## seed_dataset: sordonia/my-wiki-latex_mmlu_from_valid_all
## subjects: SUB_10
 | 267 | [truncated embedding vector] |
open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10 | 2023-10-11T20:34:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T20:33:00 | ---
pretty_name: Evaluation run of Undi95/Mistral-11B-TestBench10
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Mistral-11B-TestBench10](https://huggingface.co/Undi95/Mistral-11B-TestBench10)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T20:32:37.017457](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10/blob/main/results_2023-10-11T20-32-37.017457.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6389303587566675,\n\
\ \"acc_stderr\": 0.033080650268054235,\n \"acc_norm\": 0.6424809348544399,\n\
\ \"acc_norm_stderr\": 0.033059008030264514,\n \"mc1\": 0.39412484700122397,\n\
\ \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5556543619352063,\n\
\ \"mc2_stderr\": 0.015507002997196854\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n\
\ \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916576\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6533559051981677,\n\
\ \"acc_stderr\": 0.004749286071559565,\n \"acc_norm\": 0.8423620792670783,\n\
\ \"acc_norm_stderr\": 0.003636564286352674\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340354,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340354\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.02361088430892786,\n \
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.02361088430892786\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247337,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914742,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914742\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4498044328552803,\n\
\ \"acc_stderr\": 0.01270572149856511,\n \"acc_norm\": 0.4498044328552803,\n\
\ \"acc_norm_stderr\": 0.01270572149856511\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495144,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495144\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39412484700122397,\n\
\ \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.5556543619352063,\n\
\ \"mc2_stderr\": 0.015507002997196854\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Mistral-11B-TestBench10
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|arc:challenge|25_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hellaswag|10_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-32-37.017457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-32-37.017457.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T20-32-37.017457.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T20-32-37.017457.parquet'
- config_name: results
data_files:
- split: 2023_10_11T20_32_37.017457
path:
- results_2023-10-11T20-32-37.017457.parquet
- split: latest
path:
- results_2023-10-11T20-32-37.017457.parquet
---
# Dataset Card for Evaluation run of Undi95/Mistral-11B-TestBench10
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Mistral-11B-TestBench10
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-TestBench10](https://huggingface.co/Undi95/Mistral-11B-TestBench10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10",
	"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-11T20:32:37.017457](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench10/blob/main/results_2023-10-11T20-32-37.017457.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in the "results" config and in the "latest" split of each per-task configuration):
```python
{
"all": {
"acc": 0.6389303587566675,
"acc_stderr": 0.033080650268054235,
"acc_norm": 0.6424809348544399,
"acc_norm_stderr": 0.033059008030264514,
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5556543619352063,
"mc2_stderr": 0.015507002997196854
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.6424914675767918,
"acc_norm_stderr": 0.014005494275916576
},
"harness|hellaswag|10": {
"acc": 0.6533559051981677,
"acc_stderr": 0.004749286071559565,
"acc_norm": 0.8423620792670783,
"acc_norm_stderr": 0.003636564286352674
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340354,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340354
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.02361088430892786,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.02361088430892786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247337,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914742,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4498044328552803,
"acc_stderr": 0.01270572149856511,
"acc_norm": 0.4498044328552803,
"acc_norm_stderr": 0.01270572149856511
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495144,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39412484700122397,
"mc1_stderr": 0.017106588140700322,
"mc2": 0.5556543619352063,
"mc2_stderr": 0.015507002997196854
}
}
```
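The per-task metrics above can be aggregated directly once the JSON is loaded. The snippet below is a minimal sketch of that aggregation; the two entries in the example dict are copied from the results shown above, whereas a real run would load the full results JSON file instead.

```python
# Minimal sketch: average the "acc" values from a results dict shaped like
# the JSON above (keys "harness|<task>|<shots>" -> metric dicts).
# The two entries below are copied from the results shown; a full analysis
# would parse the complete results file.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5421686746987951},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205},
}

accs = [m["acc"] for m in results.values() if "acc" in m]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # 0.6892
```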
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,959 |
Lexington120/test | 2023-10-11T20:50:35.000Z | [
"region:us"
] | Lexington120 | null | null | 0 | 0 | 2023-10-11T20:50:35 | Entry not found | 15 |
autoevaluate/autoeval-eval-sem_eval_2018_task_1-subtask5.english-f455b0-94519146125 | 2023-10-11T20:52:19.000Z | [
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-11T20:52:14 | Entry not found | 15 |
Lexington120/embedded_faqs_medicare | 2023-10-11T21:31:09.000Z | [
"region:us"
] | Lexington120 | null | null | 0 | 0 | 2023-10-11T21:26:46 | Entry not found | 15 |
ostapeno/platy_icl5_maxD50_maxC1000000_prmt10_3 | 2023-10-11T22:09:38.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-11T22:09:26 | ## model_setting_name: platy
## max_context_length: 512
## icl_examples: 5
## icl_dataset_name: lukaemon/mmlu
## max_documents_per_subject: 50
## max_contexts_per_subject: 1000000
## icl_use_out_options: True
## seed_dataset: sordonia/my-wiki-latex_mmlu_from_valid_all
## subjects: SUB_10
## response_template: 1
## inverse_template: 0
| 336 |
mariana-guez/ff | 2023-10-11T22:45:29.000Z | [
"region:us"
] | mariana-guez | null | null | 0 | 0 | 2023-10-11T22:45:29 | Entry not found | 15 |
open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1 | 2023-10-24T00:26:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T22:56:23 | ---
pretty_name: Evaluation run of unaidedelf87777/wizard-mistral-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [unaidedelf87777/wizard-mistral-v0.1](https://huggingface.co/unaidedelf87777/wizard-mistral-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T00:26:05.989697](https://huggingface.co/datasets/open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1/blob/main/results_2023-10-24T00-26-05.989697.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005662751677852349,\n\
\ \"em_stderr\": 0.0007684582267637443,\n \"f1\": 0.07014261744966427,\n\
\ \"f1_stderr\": 0.0015546181894855703,\n \"acc\": 0.4866237666597055,\n\
\ \"acc_stderr\": 0.011199109496696186\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.005662751677852349,\n \"em_stderr\": 0.0007684582267637443,\n\
\ \"f1\": 0.07014261744966427,\n \"f1_stderr\": 0.0015546181894855703\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19029567854435178,\n \
\ \"acc_stderr\": 0.010812347283182963\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.01158587171020941\n\
\ }\n}\n```"
repo_url: https://huggingface.co/unaidedelf87777/wizard-mistral-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|arc:challenge|25_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T00_26_05.989697
path:
- '**/details_harness|drop|3_2023-10-24T00-26-05.989697.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T00-26-05.989697.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T00_26_05.989697
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-26-05.989697.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-26-05.989697.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hellaswag|10_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T22-55-59.459837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T22-55-59.459837.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T22-55-59.459837.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T00_26_05.989697
path:
- '**/details_harness|winogrande|5_2023-10-24T00-26-05.989697.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T00-26-05.989697.parquet'
- config_name: results
data_files:
- split: 2023_10_11T22_55_59.459837
path:
- results_2023-10-11T22-55-59.459837.parquet
- split: 2023_10_24T00_26_05.989697
path:
- results_2023-10-24T00-26-05.989697.parquet
- split: latest
path:
- results_2023-10-24T00-26-05.989697.parquet
---
# Dataset Card for Evaluation run of unaidedelf87777/wizard-mistral-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/unaidedelf87777/wizard-mistral-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [unaidedelf87777/wizard-mistral-v0.1](https://huggingface.co/unaidedelf87777/wizard-mistral-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
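As an illustration of the naming scheme (inferred from the listings above, not from official documentation), a run's ISO timestamp maps to the split name and the parquet file name by simple character substitutions:

```python
run_ts = "2023-10-24T00:26:05.989697"  # ISO timestamp of an evaluation run

# Split names replace both "-" and ":" with "_"
split_name = run_ts.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_10_24T00_26_05.989697

# Parquet file names keep the date hyphens but swap ":" for "-"
file_ts = run_ts.replace(":", "-")
print(file_ts)     # 2023-10-24T00-26-05.989697
```

This matches the pairs visible in the config listings, e.g. split `2023_10_24T00_26_05.989697` pointing at `results_2023-10-24T00-26-05.989697.parquet`.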
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T00:26:05.989697](https://huggingface.co/datasets/open-llm-leaderboard/details_unaidedelf87777__wizard-mistral-v0.1/blob/main/results_2023-10-24T00-26-05.989697.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.005662751677852349,
"em_stderr": 0.0007684582267637443,
"f1": 0.07014261744966427,
"f1_stderr": 0.0015546181894855703,
"acc": 0.4866237666597055,
"acc_stderr": 0.011199109496696186
},
"harness|drop|3": {
"em": 0.005662751677852349,
"em_stderr": 0.0007684582267637443,
"f1": 0.07014261744966427,
"f1_stderr": 0.0015546181894855703
},
"harness|gsm8k|5": {
"acc": 0.19029567854435178,
"acc_stderr": 0.010812347283182963
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.01158587171020941
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,742 | [
[
-0.0289154052734375,
-0.04693603515625,
0.0070648193359375,
0.018218994140625,
-0.005523681640625,
0.0002510547637939453,
-0.023773193359375,
-0.00679779052734375,
0.0250244140625,
0.04400634765625,
-0.05206298828125,
-0.06658935546875,
-0.05010986328125,
0.... |
ostapeno/platy_icl5_maxD10_maxC1000000_0 | 2023-10-11T23:44:41.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-11T23:25:31 | ## model_setting_name: platy
## max_context_length: 512
## icl_examples: 5
## icl_dataset_name: lukaemon/mmlu
## max_documents_per_subject: 10
## max_contexts_per_subject: 1000000
## icl_use_out_options: True
## seed_dataset: sordonia/my-wiki-latex_mmlu_from_valid_all
## subjects: SUB_1
| 288 | [
[
-0.037506103515625,
-0.0256500244140625,
0.0283050537109375,
0.038604736328125,
-0.031036376953125,
-0.0186004638671875,
-0.00507354736328125,
0.016082763671875,
-0.01251983642578125,
0.033203125,
-0.06292724609375,
-0.04205322265625,
-0.0265045166015625,
0.... |
ZelaAI/lj_speech_encodec | 2023-10-12T00:13:51.000Z | [
"region:us"
] | ZelaAI | null | null | 0 | 0 | 2023-10-12T00:02:56 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.04656982421875,
0.052520751953125,
0.00506591796875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060455322265625,
0.03793334... |
jsrdhher/VOICETRAIN | 2023-11-03T00:15:33.000Z | [
"region:us"
] | jsrdhher | null | null | 0 | 0 | 2023-10-12T00:09:30 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.04656982421875,
0.052520751953125,
0.00506591796875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060455322265625,
0.03793334... |
Meru32/Makoto | 2023-10-15T02:41:24.000Z | [
"region:us"
] | Meru32 | null | null | 0 | 0 | 2023-10-12T00:18:15 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.04656982421875,
0.052520751953125,
0.00506591796875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060455322265625,
0.03793334... |
open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4 | 2023-10-28T20:28:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-12T00:22:51 | ---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-7B-v4](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T20:28:28.700078](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4/blob/main/results_2023-10-28T20-28-28.700078.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10329278523489933,\n\
\ \"em_stderr\": 0.003116735713102519,\n \"f1\": 0.1624748322147643,\n\
\ \"f1_stderr\": 0.003266242273162539,\n \"acc\": 0.442081101118795,\n\
\ \"acc_stderr\": 0.011112320094960076\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.10329278523489933,\n \"em_stderr\": 0.003116735713102519,\n\
\ \"f1\": 0.1624748322147643,\n \"f1_stderr\": 0.003266242273162539\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14935557240333586,\n \
\ \"acc_stderr\": 0.009818090723727293\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|arc:challenge|25_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T20_28_28.700078
path:
- '**/details_harness|drop|3_2023-10-28T20-28-28.700078.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T20-28-28.700078.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T20_28_28.700078
path:
- '**/details_harness|gsm8k|5_2023-10-28T20-28-28.700078.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T20-28-28.700078.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hellaswag|10_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T00-22-26.630693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T00-22-26.630693.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T00-22-26.630693.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T20_28_28.700078
path:
- '**/details_harness|winogrande|5_2023-10-28T20-28-28.700078.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T20-28-28.700078.parquet'
- config_name: results
data_files:
- split: 2023_10_12T00_22_26.630693
path:
- results_2023-10-12T00-22-26.630693.parquet
- split: 2023_10_28T20_28_28.700078
path:
- results_2023-10-28T20-28-28.700078.parquet
- split: latest
path:
- results_2023-10-28T20-28-28.700078.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-7B-v4](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T20:28:28.700078](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v4/blob/main/results_2023-10-28T20-28-28.700078.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.10329278523489933,
"em_stderr": 0.003116735713102519,
"f1": 0.1624748322147643,
"f1_stderr": 0.003266242273162539,
"acc": 0.442081101118795,
"acc_stderr": 0.011112320094960076
},
"harness|drop|3": {
"em": 0.10329278523489933,
"em_stderr": 0.003116735713102519,
"f1": 0.1624748322147643,
"f1_stderr": 0.003266242273162539
},
"harness|gsm8k|5": {
"acc": 0.14935557240333586,
"acc_stderr": 0.009818090723727293
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
}
}
```
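As a quick sanity check on the numbers above, the top-level `acc` equals the unweighted mean of the gsm8k and winogrande accuracies (the aggregation rule is inferred from the values, not from documentation):

```python
# Check that the aggregate "acc" is the unweighted mean of the per-task
# accuracies reported in the latest results above.
gsm8k_acc = 0.14935557240333586
winogrande_acc = 0.7348066298342542
reported_aggregate = 0.442081101118795

aggregate_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(aggregate_acc - reported_aggregate) < 1e-12
```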
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,784 | [
[
-0.030853271484375,
-0.048583984375,
0.019744873046875,
0.01453399658203125,
-0.0117645263671875,
0.0028095245361328125,
-0.0242919921875,
-0.00862884521484375,
0.03509521484375,
0.047576904296875,
-0.05255126953125,
-0.06646728515625,
-0.047576904296875,
0.... |
WK1997/SSV_top10_classes | 2023-10-12T00:34:14.000Z | [
"region:us"
] | WK1997 | null | null | 0 | 0 | 2023-10-12T00:34:14 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.04656982421875,
0.052520751953125,
0.00506591796875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060455322265625,
0.03793334... |
pharaouk/vqa_dataset | 2023-10-12T00:56:50.000Z | [
"region:us"
] | pharaouk | null | null | 0 | 0 | 2023-10-12T00:56:06 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.04656982421875,
0.052520751953125,
0.00506591796875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060455322265625,
0.03793334... |
Miwafajri/Miwatest | 2023-10-12T01:04:06.000Z | [
"region:us"
] | Miwafajri | null | null | 0 | 0 | 2023-10-12T01:00:43 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
ContextualAI/spj-bert-large-cased | 2023-10-12T01:25:08.000Z | [
"region:us"
] | ContextualAI | null | null | 0 | 0 | 2023-10-12T01:25:08 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
BAAI/DataOptim | 2023-10-31T08:26:11.000Z | [
"task_categories:visual-question-answering",
"size_categories:1M<n<10M",
"language:en",
"region:us"
] | BAAI | null | null | 1 | 0 | 2023-10-12T01:30:44 | ---
task_categories:
- visual-question-answering
language:
- en
pretty_name: DataOptim
size_categories:
- 1M<n<10M
---
# DataOptim
We launch DataOptim, an MLLM benchmark and competition in which we aim to find the optimal training data for Multimodal Large Language Models (MLLMs).
- Project page: http://dataoptim.org
- GitHub: https://github.com/BAAI-DCAI/DataOptim
## Datasets
Currently, the visual instruction tuning data used in the challenge are drawn from 17 public datasets.
More datasets are coming in the future!
|Category|Dataset|Images|Samples|Split|
|:-:|:-:|:-:|:-:|:-:|
|Image captioning|COCO|82783|414113|train|
|Image captioning|Flickr30K|29000|145000|Karpathy train split|
|Image captioning|TextCaps|21953|109765|train|
|Visual question answering|VQAv2|82783|443757|train|
|Visual question answering|OKVQA|8998|9009|train|
|Visual question answering|OCRVQA|166041|801673|train|
|Visual question answering|GQA|72140|943000|train|
|Visual question answering|TextVQA|21953|34602|train|
|Visual question answering|A-OKVQA|16540|17056|train|
|Visual question answering|ScienceQA|6218|6218|train|
|Visual question answering|Visual Genome QA (VGQA)|99280|1445322|-|
|Visual question answering|DocVQA|10194|39463|train|
|Visual question answering|DVQA|200000|2325316|train|
|Grounding|RefCOCO/RefCOCO+/RefCOCOg|24407|287604|train|
|Grounding|Shikra-RD|883|5922|train|
|GPT-4 generated|LLaVA-Instruct-150K|81479|157712|-|
|GPT-4 generated|SVIT|108076|2992799|-|
|Total||818K|10.4M||
We use different strategies to collect the prompts for different tasks.
- **Image captioning.** We carefully collect 5 manually written instructions and randomly sample one as the prompt for each caption. The fourth and fifth instructions are from [InstructBLIP](https://github.com/salesforce/LAVIS/blob/main/projects/instructblip/README.md).
- **Open-ended VQA.** As the answers in VQA datasets are generally short, we add an instruction after the question to ask the model to provide answers with a short sentence or phrase.
- **Multiple-choice VQA.** For A-OKVQA, we add an instruction before the question to ask the model to provide answers with the correct options. For ScienceQA, we use the instructions and templates designed by [M3IT](https://m3-it.github.io/) and randomly sample one to format the prompt. Only data with image context are involved.
- **Grounding.** We use the templates designed by [Shikra](https://github.com/shikras/shikra) and randomly sample one to format the prompt.
- **GPT-4 generated datasets.** We keep the prompts unchanged.
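The sampling strategies above can be sketched in a few lines of Python. This is a minimal illustration — the function names and exact sampling logic are assumptions, not the authors' code; the instruction texts are taken from the prompt table in this card:

```python
import random

# The five manually written captioning instructions listed in this card.
CAPTION_INSTRUCTIONS = [
    "Describe the image as simply as possible with a sentence or phrase.",
    "Give a brief summary of what you see.",
    "Provide a short description of the image.",
    "Write a short description for the image.",
    "Briefly describe the content of the image.",
]

def caption_prompt(rng: random.Random) -> str:
    # Image captioning: randomly sample one of the five instructions per caption.
    return rng.choice(CAPTION_INSTRUCTIONS)

def open_ended_vqa_prompt(question: str) -> str:
    # Open-ended VQA: append an instruction asking for a short answer.
    return f"{question} Answer the question directly with a short sentence or phrase."

def multiple_choice_vqa_prompt(question: str) -> str:
    # Multiple-choice VQA (A-OKVQA): prepend an instruction asking for the correct option.
    return f"Choose the correct option for the following question: {question}"

rng = random.Random(0)
print(caption_prompt(rng))
print(open_ended_vqa_prompt("What color is the bus?"))
```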
|Category|Data|Prompts|
|:-:|:-:|:-:|
|Image captioning|COCO, Flickr30K, TextCaps|Describe the image as simply as possible with a sentence or phrase.<br />Give a brief summary of what you see.<br />Provide a short description of the image.<br />Write a short description for the image.<br />Briefly describe the content of the image.|
|Open-ended VQA|VQAv2, OKVQA, OCRVQA, GQA, TextVQA|*question* Answer the question directly with a short sentence or phrase.|
|Multiple-choice VQA|A-OKVQA|Choose the correct option for the following question: *question*|
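The data are released in the conversation-style training format of LLaVA. As a rough illustration, a single record might look like the sketch below — the field names (`id`, `image`, `conversations`, `from`, `value`) follow the public LLaVA format and are assumptions, not verified against this release; the identifier and path are made up:

```python
import json

# Hypothetical example of one LLaVA-style training record.
record = {
    "id": "coco-000000123456",                    # made-up identifier
    "image": "coco/train2017/000000123456.jpg",   # made-up image path
    "conversations": [
        {"from": "human",
         "value": "<image>\nProvide a short description of the image."},
        {"from": "gpt",
         "value": "A red double-decker bus parked on a city street."},
    ],
}

# Records of this shape serialize cleanly to the JSON files in the data folder.
print(json.dumps(record, indent=2))
```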
For now, the visual instruction tuning data are formatted in the training format of [LLaVA](https://github.com/haotian-liu/LLaVA) in the [data](https://huggingface.co/datasets/BAAI/DataOptim/tree/main/data) folder. The images can be found in the [images](https://huggingface.co/datasets/BAAI/DataOptim/tree/main/images) folder or on their official websites. | 3,466 | [
[
-0.04119873046875,
-0.056365966796875,
0.0276641845703125,
-0.002338409423828125,
-0.0214996337890625,
-0.00490570068359375,
-0.0111541748046875,
-0.00499725341796875,
-0.007678985595703125,
0.044708251953125,
-0.06365966796875,
-0.0528564453125,
-0.025985717773... |
chris-buenrostro/dataset-generator-cmb | 2023-10-12T01:34:25.000Z | [
"region:us"
] | chris-buenrostro | null | null | 0 | 0 | 2023-10-12T01:34:23 | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 19952
num_examples: 10
download_size: 26112
dataset_size: 19952
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dataset-generator-cmb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 528 | [
[
-0.056427001953125,
-0.028045654296875,
0.027801513671875,
0.0091094970703125,
-0.0310821533203125,
0.00891876220703125,
0.0112457275390625,
0.004180908203125,
0.06182861328125,
0.032989501953125,
-0.07501220703125,
-0.055572509765625,
-0.046539306640625,
-0... |
rdiazconcha/marketing-synthetic | 2023-10-12T01:39:43.000Z | [
"region:us"
] | rdiazconcha | null | null | 0 | 0 | 2023-10-12T01:39:41 | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 20249
num_examples: 10
download_size: 27613
dataset_size: 20249
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "marketing-synthetic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 526 | [
[
-0.032318115234375,
-0.034820556640625,
0.0025539398193359375,
0.0208282470703125,
0.006710052490234375,
0.0237884521484375,
-0.00020134449005126953,
-0.0251617431640625,
0.0675048828125,
0.03680419921875,
-0.07574462890625,
-0.051727294921875,
-0.01163482666015... |
beatbox1200/v-j-test2 | 2023-10-12T01:52:59.000Z | [
"region:us"
] | beatbox1200 | null | null | 0 | 0 | 2023-10-12T01:49:15 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.035064697265625,
0.0465087890625,
0.052490234375,
0.00505828857421875,
0.051361083984375,
0.01702880859375,
-0.05206298828125,
-0.01497650146484375,
-0.060302734375,
0.03790283203... |
qureshiu/aci-weppages | 2023-10-12T02:03:41.000Z | [
"region:us"
] | qureshiu | null | null | 0 | 0 | 2023-10-12T02:02:36 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.035064697265625,
0.0465087890625,
0.052490234375,
0.00505828857421875,
0.051361083984375,
0.01702880859375,
-0.05206298828125,
-0.01497650146484375,
-0.060302734375,
0.03790283203... |
pharaouk/bakllava_instruct_v | 2023-10-12T02:20:20.000Z | [
"region:us"
] | pharaouk | null | null | 0 | 0 | 2023-10-12T02:20:20 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.035064697265625,
0.0465087890625,
0.052490234375,
0.00505828857421875,
0.051361083984375,
0.01702880859375,
-0.05206298828125,
-0.01497650146484375,
-0.060302734375,
0.03790283203... |
Harry-Li-27/SKILL | 2023-10-15T05:37:51.000Z | [
"region:us"
] | Harry-Li-27 | null | null | 0 | 0 | 2023-10-12T03:16:35 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.035064697265625,
0.0465087890625,
0.052490234375,
0.00505828857421875,
0.051361083984375,
0.01702880859375,
-0.05206298828125,
-0.01497650146484375,
-0.060302734375,
0.03790283203... |
artdwn/sdxl-dataset | 2023-10-12T04:13:24.000Z | [
"region:us"
] | artdwn | null | null | 0 | 0 | 2023-10-12T04:09:22 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.035064697265625,
0.0465087890625,
0.052490234375,
0.00505828857421875,
0.051361083984375,
0.01702880859375,
-0.05206298828125,
-0.01497650146484375,
-0.060302734375,
0.03790283203... |
xinqiyang/llama2_japanese_demo | 2023-10-12T05:04:27.000Z | [
"region:us"
] | xinqiyang | null | null | 0 | 0 | 2023-10-12T05:04:27 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.035064697265625,
0.0465087890625,
0.052490234375,
0.00505828857421875,
0.051361083984375,
0.01702880859375,
-0.05206298828125,
-0.01497650146484375,
-0.060302734375,
0.03790283203... |
Greenvs/latian-test | 2023-10-12T05:10:26.000Z | [
"region:us"
] | Greenvs | null | null | 0 | 0 | 2023-10-12T05:07:26 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
harshi173/SkinClassifications | 2023-10-12T05:19:50.000Z | [
"region:us"
] | harshi173 | null | null | 0 | 0 | 2023-10-12T05:19:50 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
manu/code_20b | 2023-10-16T05:13:41.000Z | [
"region:us"
] | manu | null | null | 0 | 0 | 2023-10-12T05:37:03 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 66209111592
num_examples: 11692337
- name: test
num_bytes: 276152957
num_examples: 48689
download_size: 0
dataset_size: 66485264549
---
# Dataset Card for "code_20b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 614 | [
[
-0.05096435546875,
-0.0203857421875,
0.00653839111328125,
0.045806884765625,
-0.006069183349609375,
0.0135345458984375,
0.0157318115234375,
-0.0156707763671875,
0.050048828125,
0.039520263671875,
-0.050933837890625,
-0.058563232421875,
-0.036163330078125,
-0... |
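As a quick sanity check on the `code_20b` card metadata above, the reported `dataset_size` is simply the sum of the two split byte counts:

```python
# Split byte counts reported in the code_20b card metadata.
train_bytes = 66_209_111_592  # 11,692,337 examples
test_bytes = 276_152_957      # 48,689 examples

# dataset_size in the metadata is the sum of the split sizes.
print(train_bytes + test_bytes)  # 66485264549
```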
artdwn/arknights | 2023-10-12T08:16:12.000Z | [
"region:us"
] | artdwn | null | null | 0 | 0 | 2023-10-12T05:49:47 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
artdwn/terra-dif-xl | 2023-10-12T05:53:14.000Z | [
"region:us"
] | artdwn | null | null | 0 | 0 | 2023-10-12T05:53:14 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.014984130859375,
0.05718994140625,
0.0288543701171875,
-0.0350341796875,
0.046478271484375,
0.052520751953125,
0.005062103271484375,
0.051361083984375,
0.016998291015625,
-0.0521240234375,
-0.01496124267578125,
-0.0604248046875,
0.037... |
manu/code_20b_separate | 2023-10-16T05:13:45.000Z | [
"region:us"
] | manu | null | null | 0 | 0 | 2023-10-12T06:07:10 | ---
configs:
- config_name: default
data_files:
- split: StarcoderdataPythonTest
path: data/StarcoderdataPythonTest-*
- split: StarcoderdataMarkdownTest
path: data/StarcoderdataMarkdownTest-*
- split: StarcoderdataJupyterScriptsDedupFilteredTest
path: data/StarcoderdataJupyterScriptsDedupFilteredTest-*
- split: StarcoderdataJupyterStructuredCleanDedupTest
path: data/StarcoderdataJupyterStructuredCleanDedupTest-*
- split: StarcoderdataJsonTest
path: data/StarcoderdataJsonTest-*
- split: CodeContestsTest
path: data/CodeContestsTest-*
- split: PypiCleanTest
path: data/PypiCleanTest-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: dataset_id
dtype: string
splits:
- name: StarcoderdataPythonTest
num_bytes: 45900630
num_examples: 10000
- name: StarcoderdataMarkdownTest
num_bytes: 40927519
num_examples: 10000
- name: StarcoderdataJupyterScriptsDedupFilteredTest
num_bytes: 15297731
num_examples: 1829
- name: StarcoderdataJupyterStructuredCleanDedupTest
num_bytes: 12631734
num_examples: 1337
- name: StarcoderdataJsonTest
num_bytes: 8853154
num_examples: 7127
- name: CodeContestsTest
num_bytes: 28120884
num_examples: 8396
- name: PypiCleanTest
num_bytes: 124421305
num_examples: 10000
download_size: 0
dataset_size: 276152957
---
# Dataset Card for "code_20b_separate"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 1,586 | [
[
-0.056060791015625,
-0.024749755859375,
-0.00591278076171875,
0.04901123046875,
-0.0169525146484375,
0.023223876953125,
0.007781982421875,
-0.020355224609375,
0.0537109375,
0.046417236328125,
-0.05023193359375,
-0.05059814453125,
-0.035858154296875,
-0.00632... |
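The per-split byte counts in the `code_20b_separate` card above sum to its reported `dataset_size` (276152957 bytes), which is also exactly the size of the `test` split of `code_20b` — i.e. this repo is the same held-out data broken out by source. A quick check:

```python
# Per-split byte counts from the code_20b_separate card metadata.
split_bytes = {
    "StarcoderdataPythonTest": 45_900_630,
    "StarcoderdataMarkdownTest": 40_927_519,
    "StarcoderdataJupyterScriptsDedupFilteredTest": 15_297_731,
    "StarcoderdataJupyterStructuredCleanDedupTest": 12_631_734,
    "StarcoderdataJsonTest": 8_853_154,
    "CodeContestsTest": 28_120_884,
    "PypiCleanTest": 124_421_305,
}

# Sums to the reported dataset_size, 276152957 bytes.
print(sum(split_bytes.values()))  # 276152957
```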
heegyu/chart2text_statista | 2023-10-12T07:02:23.000Z | [
"region:us"
] | heegyu | null | null | 0 | 0 | 2023-10-12T06:42:42 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: title
dtype: string
- name: dataPath
dtype: string
- name: imgPath
dtype: string
- name: caption
dtype: string
- name: first_caption
dtype: string
- name: chartType
dtype: string
- name: release date
dtype: string
- name: Region
dtype: string
- name: survey time period
dtype: string
- name: xAxis
dtype: string
- name: yAxis
dtype: string
- name: URL
dtype: string
- name: image
dtype: image
- name: data
dtype: string
- name: columns
dtype: string
- name: length
dtype: float64
splits:
- name: train
num_bytes: 1034457048.216
num_examples: 24368
- name: val
num_bytes: 223876316.638
num_examples: 5221
- name: test
num_bytes: 224020677.682
num_examples: 5222
download_size: 763065167
dataset_size: 1482354042.536
---
# Dataset Card for "chart2text_statista"
original dataset: https://github.com/vis-nlp/Chart-to-text | 1,019 | [
[
0.0009794235229492188,
-0.012298583984375,
0.0003991127014160156,
0.013214111328125,
-0.04803466796875,
0.00257110595703125,
-0.0177459716796875,
-0.022003173828125,
0.0391845703125,
0.05072021484375,
-0.027313232421875,
-0.053741455078125,
-0.037872314453125,
... |
Om007/kendal_bot | 2023-10-12T06:59:42.000Z | [
"task_categories:question-answering",
"language:en",
"region:us"
] | Om007 | null | null | 0 | 0 | 2023-10-12T06:47:35 | ---
task_categories:
- question-answering
language:
- en
---
# Dataset Card for Kendal
<!-- Provide a quick summary of the dataset. -->
This is a dataset for Kendal Bot.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | 4,221 | [
[
-0.0360107421875,
-0.0435791015625,
0.0233612060546875,
0.00968170166015625,
-0.019775390625,
0.0019989013671875,
-0.00534820556640625,
-0.037933349609375,
0.034149169921875,
0.05938720703125,
-0.054473876953125,
-0.06304931640625,
-0.034637451171875,
0.0041... |
BoyaWu10/Test | 2023-10-12T06:54:06.000Z | [
"region:us"
] | BoyaWu10 | null | null | 0 | 0 | 2023-10-12T06:54:06 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.014984130859375,
0.05718994140625,
0.0288543701171875,
-0.0350341796875,
0.046478271484375,
0.052520751953125,
0.005062103271484375,
0.051361083984375,
0.016998291015625,
-0.0521240234375,
-0.01496124267578125,
-0.0604248046875,
0.037... |
Aoschu/donut_invoice | 2023-10-12T07:22:27.000Z | [
"region:us"
] | Aoschu | null | null | 0 | 0 | 2023-10-12T07:20:31 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.014984130859375,
0.05718994140625,
0.0288543701171875,
-0.0350341796875,
0.046478271484375,
0.052520751953125,
0.005062103271484375,
0.051361083984375,
0.016998291015625,
-0.0521240234375,
-0.01496124267578125,
-0.0604248046875,
0.037... |
simwit/20231012-test | 2023-10-12T07:29:26.000Z | [
"region:us"
] | simwit | null | null | 0 | 0 | 2023-10-12T07:27:37 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
autoevaluate/autoeval-eval-ade_corpus_v2-Ade_corpus_v2_classification-376fff-94601146168 | 2023-10-12T07:32:27.000Z | [
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-12T07:32:24 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.035064697265625,
0.0465087890625,
0.052520751953125,
0.00507354736328125,
0.0513916015625,
0.0170135498046875,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
autoevaluate/autoeval-eval-ade_corpus_v2-Ade_corpus_v2_classification-79653b-94602146169 | 2023-10-12T07:32:32.000Z | [
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-12T07:32:28 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.035064697265625,
0.0465087890625,
0.052520751953125,
0.00507354736328125,
0.0513916015625,
0.0170135498046875,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
taewon3779/DGB_Project | 2023-10-12T07:34:32.000Z | [
"region:us"
] | taewon3779 | null | null | 0 | 0 | 2023-10-12T07:33:12 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.035064697265625,
0.0465087890625,
0.052520751953125,
0.00507354736328125,
0.0513916015625,
0.0170135498046875,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
arthurdubrou/Bird_explained_corrections | 2023-10-13T15:26:32.000Z | [
"license:apache-2.0",
"region:us"
] | arthurdubrou | null | null | 0 | 0 | 2023-10-12T07:37:14 | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
This dataset is truncated | 88 | [
[
-0.006793975830078125,
-0.0159912109375,
-0.0003504753112792969,
0.00975799560546875,
-0.060821533203125,
0.036224365234375,
0.01334381103515625,
0.01285552978515625,
0.0289154052734375,
0.025146484375,
-0.059539794921875,
-0.0206298828125,
-0.05828857421875,
... |
bongo2112/mixed-SDXL-Video-Outputs_v2 | 2023-10-12T10:41:42.000Z | [
"region:us"
] | bongo2112 | null | null | 0 | 0 | 2023-10-12T08:00:02 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
S3Eval/General | 2023-10-12T13:15:04.000Z | [
"region:us"
] | S3Eval | null | null | 0 | 0 | 2023-10-12T08:11:02 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B | 2023-10-24T18:24:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-12T08:33:46 | ---
pretty_name: Evaluation run of teknium/CollectiveCognition-v1.1-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/CollectiveCognition-v1.1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of\
\ the run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T18:24:08.168024](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B/blob/main/results_2023-10-24T18-24-08.168024.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.14481963087248323,\n\
\ \"em_stderr\": 0.003603978827087507,\n \"f1\": 0.19846161912751598,\n\
\ \"f1_stderr\": 0.0036570269650408635,\n \"acc\": 0.45496396842218007,\n\
\ \"acc_stderr\": 0.011053937338597487\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.14481963087248323,\n \"em_stderr\": 0.003603978827087507,\n\
\ \"f1\": 0.19846161912751598,\n \"f1_stderr\": 0.0036570269650408635\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1561789234268385,\n \
\ \"acc_stderr\": 0.00999950936975745\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437523\n\
\ }\n}\n```"
repo_url: https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|arc:challenge|25_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T18_24_08.168024
path:
- '**/details_harness|drop|3_2023-10-24T18-24-08.168024.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T18-24-08.168024.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T18_24_08.168024
path:
- '**/details_harness|gsm8k|5_2023-10-24T18-24-08.168024.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T18-24-08.168024.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hellaswag|10_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-33-23.557832.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T08-33-23.557832.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T08-33-23.557832.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T18_24_08.168024
path:
- '**/details_harness|winogrande|5_2023-10-24T18-24-08.168024.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T18-24-08.168024.parquet'
- config_name: results
data_files:
- split: 2023_10_12T08_33_23.557832
path:
- results_2023-10-12T08-33-23.557832.parquet
- split: 2023_10_24T18_24_08.168024
path:
- results_2023-10-24T18-24-08.168024.parquet
- split: latest
path:
- results_2023-10-24T18-24-08.168024.parquet
---
# Dataset Card for Evaluation run of teknium/CollectiveCognition-v1.1-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/CollectiveCognition-v1.1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-24T18:24:08.168024](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B/blob/main/results_2023-10-24T18-24-08.168024.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"em": 0.14481963087248323,
"em_stderr": 0.003603978827087507,
"f1": 0.19846161912751598,
"f1_stderr": 0.0036570269650408635,
"acc": 0.45496396842218007,
"acc_stderr": 0.011053937338597487
},
"harness|drop|3": {
"em": 0.14481963087248323,
"em_stderr": 0.003603978827087507,
"f1": 0.19846161912751598,
"f1_stderr": 0.0036570269650408635
},
"harness|gsm8k|5": {
"acc": 0.1561789234268385,
"acc_stderr": 0.00999950936975745
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.012108365307437523
}
}
```
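As a quick sanity check on the JSON structure above, the aggregate `acc` in the `"all"` block appears to be the unweighted mean of the per-task accuracies (here gsm8k and winogrande); a minimal sketch reproducing it, assuming that aggregation rule (the leaderboard's own code is the authoritative source):

```python
from statistics import mean

# Per-task accuracies copied from the results JSON above.
per_task = {
    "harness|gsm8k|5": {"acc": 0.1561789234268385},
    "harness|winogrande|5": {"acc": 0.7537490134175217},
}

# Unweighted mean across the tasks that report "acc".
agg_acc = mean(task["acc"] for task in per_task.values())
print(agg_acc)  # ~0.454964, matching the "all" block above
```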
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,831 | [
[
-0.027099609375,
-0.0399169921875,
0.0162200927734375,
0.0096588134765625,
-0.0160369873046875,
0.01654052734375,
-0.0281524658203125,
-0.01212310791015625,
0.0284576416015625,
0.0343017578125,
-0.048187255859375,
-0.0758056640625,
-0.051544189453125,
0.0179... |
open-llm-leaderboard/details_teknium__CollectiveCognition-v1-Mistral-7B | 2023-10-29T01:40:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-12T08:39:42 | ---
pretty_name: Evaluation run of teknium/CollectiveCognition-v1-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/CollectiveCognition-v1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__CollectiveCognition-v1-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T01:40:21.634950](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__CollectiveCognition-v1-Mistral-7B/blob/main/results_2023-10-29T01-40-21.634950.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the \"results\" and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.014786073825503355,\n\
\ \"em_stderr\": 0.0012360366760473097,\n \"f1\": 0.07218645134228192,\n\
\ \"f1_stderr\": 0.0017555798787673934,\n \"acc\": 0.47738594388492395,\n\
\ \"acc_stderr\": 0.011139031066837696\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.014786073825503355,\n \"em_stderr\": 0.0012360366760473097,\n\
\ \"f1\": 0.07218645134228192,\n \"f1_stderr\": 0.0017555798787673934\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17892342683851403,\n \
\ \"acc_stderr\": 0.010557661392901294\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774099\n\
\ }\n}\n```"
repo_url: https://huggingface.co/teknium/CollectiveCognition-v1-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|arc:challenge|25_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T01_40_21.634950
path:
- '**/details_harness|drop|3_2023-10-29T01-40-21.634950.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T01-40-21.634950.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T01_40_21.634950
path:
- '**/details_harness|gsm8k|5_2023-10-29T01-40-21.634950.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T01-40-21.634950.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hellaswag|10_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-39-18.628472.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T08-39-18.628472.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T08-39-18.628472.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T01_40_21.634950
path:
- '**/details_harness|winogrande|5_2023-10-29T01-40-21.634950.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T01-40-21.634950.parquet'
- config_name: results
data_files:
- split: 2023_10_12T08_39_18.628472
path:
- results_2023-10-12T08-39-18.628472.parquet
- split: 2023_10_29T01_40_21.634950
path:
- results_2023-10-29T01-40-21.634950.parquet
- split: latest
path:
- results_2023-10-29T01-40-21.634950.parquet
---
# Dataset Card for Evaluation run of teknium/CollectiveCognition-v1-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/CollectiveCognition-v1-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/CollectiveCognition-v1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__CollectiveCognition-v1-Mistral-7B",
"harness_winogrande_5",
split="train")
```
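The split names listed in the configurations above are derived mechanically from the run timestamp: dashes and colons become underscores (e.g. run `2023-10-29T01:40:21.634950` appears as split `2023_10_29T01_40_21.634950`). A minimal sketch of that mapping — the helper name is hypothetical, shown only to illustrate the naming scheme:

```python
def run_split_name(timestamp: str) -> str:
    """Derive a dataset split name from an ISO-style run timestamp.

    Dashes and colons in the timestamp are replaced with underscores,
    matching the split names listed in this card's configurations.
    """
    return timestamp.replace("-", "_").replace(":", "_")
```

For example, `run_split_name("2023-10-29T01:40:21.634950")` yields `"2023_10_29T01_40_21.634950"`, the split listed under `harness_winogrande_5`.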
## Latest results
These are the [latest results from run 2023-10-29T01:40:21.634950](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__CollectiveCognition-v1-Mistral-7B/blob/main/results_2023-10-29T01-40-21.634950.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.014786073825503355,
"em_stderr": 0.0012360366760473097,
"f1": 0.07218645134228192,
"f1_stderr": 0.0017555798787673934,
"acc": 0.47738594388492395,
"acc_stderr": 0.011139031066837696
},
"harness|drop|3": {
"em": 0.014786073825503355,
"em_stderr": 0.0012360366760473097,
"f1": 0.07218645134228192,
"f1_stderr": 0.0017555798787673934
},
"harness|gsm8k|5": {
"acc": 0.17892342683851403,
"acc_stderr": 0.010557661392901294
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774099
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,818 | [
[
-0.0274200439453125,
-0.041717529296875,
0.0170440673828125,
0.00951385498046875,
-0.015869140625,
0.0170135498046875,
-0.0281219482421875,
-0.011077880859375,
0.0278778076171875,
0.035369873046875,
-0.048553466796875,
-0.075927734375,
-0.050750732421875,
0.... |
SouravModak/super-cool-weed | 2023-10-12T16:49:58.000Z | [
"region:us"
] | SouravModak | null | null | 0 | 0 | 2023-10-12T08:40:35 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 22631043.6
num_examples: 1300
download_size: 0
dataset_size: 22631043.6
---
# Dataset Card for "super-cool-weed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 443 | [
[
-0.03289794921875,
-0.034942626953125,
0.0166473388671875,
0.0056610107421875,
-0.0154876708984375,
0.0105743408203125,
0.00760650634765625,
-0.007511138916015625,
0.07666015625,
0.0180511474609375,
-0.057464599609375,
-0.0660400390625,
-0.039154052734375,
-... |
sammyontheshow/en_hr_para | 2023-10-12T08:41:31.000Z | [
"region:us"
] | sammyontheshow | null | null | 0 | 0 | 2023-10-12T08:40:41 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01494598388671875,
0.057159423828125,
0.028839111328125,
-0.0350341796875,
0.04656982421875,
0.052490234375,
0.00504302978515625,
0.0513916015625,
0.016998291015625,
-0.0521240234375,
-0.0149993896484375,
-0.06036376953125,
0.03790283... |
open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B | 2023-10-25T09:46:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-12T08:45:48 | ---
pretty_name: Evaluation run of teknium/Mistral-Trismegistus-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/Mistral-Trismegistus-7B](https://huggingface.co/teknium/Mistral-Trismegistus-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T09:46:08.723071](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B/blob/main/results_2023-10-25T09-46-08.723071.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.010591442953020135,\n\
\ \"em_stderr\": 0.0010483469790502314,\n \"f1\": 0.07238674496644287,\n\
\ \"f1_stderr\": 0.001675223530701393,\n \"acc\": 0.4004875617305928,\n\
\ \"acc_stderr\": 0.010548628211357203\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.010591442953020135,\n \"em_stderr\": 0.0010483469790502314,\n\
\ \"f1\": 0.07238674496644287,\n \"f1_stderr\": 0.001675223530701393\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09931766489764973,\n \
\ \"acc_stderr\": 0.008238371412683985\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7016574585635359,\n \"acc_stderr\": 0.012858885010030421\n\
\ }\n}\n```"
repo_url: https://huggingface.co/teknium/Mistral-Trismegistus-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|arc:challenge|25_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T09_46_08.723071
path:
- '**/details_harness|drop|3_2023-10-25T09-46-08.723071.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T09-46-08.723071.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T09_46_08.723071
path:
- '**/details_harness|gsm8k|5_2023-10-25T09-46-08.723071.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T09-46-08.723071.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hellaswag|10_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-45-24.509522.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T08-45-24.509522.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-12T08-45-24.509522.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T09_46_08.723071
path:
- '**/details_harness|winogrande|5_2023-10-25T09-46-08.723071.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T09-46-08.723071.parquet'
- config_name: results
data_files:
- split: 2023_10_12T08_45_24.509522
path:
- results_2023-10-12T08-45-24.509522.parquet
- split: 2023_10_25T09_46_08.723071
path:
- results_2023-10-25T09-46-08.723071.parquet
- split: latest
path:
- results_2023-10-25T09-46-08.723071.parquet
---
# Dataset Card for Evaluation run of teknium/Mistral-Trismegistus-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/Mistral-Trismegistus-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/Mistral-Trismegistus-7B](https://huggingface.co/teknium/Mistral-Trismegistus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the results of the most recent run.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B",
"harness_winogrande_5",
	split="latest")
```
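As a side note (an illustration, not part of the generated card): the split names listed in the configuration above appear to be derived from the run timestamps by replacing the `-` and `:` separators with `_`, presumably because those characters are not valid in dataset split names. A minimal sketch:

```python
def timestamp_to_split_name(run_timestamp: str) -> str:
    """Map a run timestamp such as '2023-10-25T09-46-08.723071'
    to the corresponding split name used in this dataset."""
    # '-' and ':' are replaced by '_'; 'T' and '.' are kept as-is
    return run_timestamp.replace("-", "_").replace(":", "_")

# The latest run of this evaluation:
print(timestamp_to_split_name("2023-10-25T09-46-08.723071"))
# 2023_10_25T09_46_08.723071
```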
## Latest results
These are the [latest results from run 2023-10-25T09:46:08.723071](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B/blob/main/results_2023-10-25T09-46-08.723071.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.010591442953020135,
"em_stderr": 0.0010483469790502314,
"f1": 0.07238674496644287,
"f1_stderr": 0.001675223530701393,
"acc": 0.4004875617305928,
"acc_stderr": 0.010548628211357203
},
"harness|drop|3": {
"em": 0.010591442953020135,
"em_stderr": 0.0010483469790502314,
"f1": 0.07238674496644287,
"f1_stderr": 0.001675223530701393
},
"harness|gsm8k|5": {
"acc": 0.09931766489764973,
"acc_stderr": 0.008238371412683985
},
"harness|winogrande|5": {
"acc": 0.7016574585635359,
"acc_stderr": 0.012858885010030421
}
}
```
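A quick sanity check on the numbers above (an observation about this particular run, not something documented in the card): the `acc` value in the `"all"` block matches the unweighted mean of the two per-task accuracies, while `em`/`f1` simply mirror the single DROP task:

```python
import math

# Per-task accuracies reported in the latest run (values copied from above)
gsm8k_acc = 0.09931766489764973
winogrande_acc = 0.7016574585635359

# "all".acc looks like a plain unweighted mean over the accuracy tasks
all_acc = (gsm8k_acc + winogrande_acc) / 2
assert math.isclose(all_acc, 0.4004875617305928, abs_tol=1e-12)
print(all_acc)
```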
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]