| datasetId | card |
|---|---|
liuyanchen1015/MULTI_VALUE_sst2_null_genitive | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 18658
num_examples: 120
- name: test
num_bytes: 41399
num_examples: 265
- name: train
num_bytes: 541116
num_examples: 4562
download_size: 348124
dataset_size: 601173
---
# Dataset Card for "MULTI_VALUE_sst2_null_genitive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
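Each card in this dump follows the same shape: an optional YAML frontmatter block delimited by `---` lines, followed by a markdown body. A minimal sketch of splitting the two with plain string handling (the `card` value below is a hypothetical example, not a row from this table):

```python
def split_card(card: str):
    """Split a dataset card into (frontmatter, body).

    The frontmatter is the text between the leading '---' line and the
    next '---' line; the body is everything after it. Cards without a
    frontmatter block return ('', card).
    """
    lines = card.splitlines()
    if not lines or lines[0].strip() != "---":
        return "", card
    for i, line in enumerate(lines[1:], start=1):
        if line.strip() == "---":
            return "\n".join(lines[1:i]), "\n".join(lines[i + 1:])
    # Unterminated frontmatter: treat the whole card as body.
    return "", card

card = '---\nlicense: mit\n---\n# Dataset Card for "example"'
fm, body = split_card(card)
print(fm)    # license: mit
print(body)  # # Dataset Card for "example"
```

The frontmatter string can then be handed to any YAML parser to recover fields like `dataset_info` or `license`.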
bn22/albanian-corpora-bert-tokenized-2k-tinyllama | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 5372481200
num_examples: 201700
download_size: 1190977215
dataset_size: 5372481200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pravsels/AnimationsWithManim_Elteoremadebeethoven_issues | ---
dataset_info:
features:
- name: number
dtype: int64
- name: content
dtype: string
- name: comments
sequence: string
splits:
- name: train
num_bytes: 220070
num_examples: 108
download_size: 60030
dataset_size: 220070
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_arc_en_dynamic | ---
dataset_info:
features:
- name: keys
dtype: string
- name: values
sequence: string
splits:
- name: train
num_bytes: 127105
num_examples: 250
download_size: 12369
dataset_size: 127105
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
storytracer/German-PD-Newspapers | ---
license: cc0-1.0
task_categories:
- text-generation
language:
- de
tags:
- newspapers
- ocr
- public domain
pretty_name: Public Domain Newspapers (German)
size_categories:
- 10B<n<100B
---
# Dataset Card for Public Domain Newspapers (German)
<!-- Provide a quick summary of the dataset. -->
This dataset contains 13 billion words of OCR text extracted from German historical newspapers.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [Sebastian Majstorovic](https://www.storytracer.org)
- **Language(s) (NLP):** German
- **License:** Dataset: CC0, Texts: Public Domain
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://www.deutsche-digitale-bibliothek.de/newspaper
### Copyright & License
The newspaper texts have been determined to be in the Public Domain by the institutions that provided them to the newspaper portal of the German Digital National Library. The dataset itself, excluding the texts, is licensed under the [CC0 license](https://creativecommons.org/public-domain/cc0/). |
helloerikaaa/solar-activity-history | ---
language:
- en
license: mit
tags:
- timeseries
- astronomy
- solar-activity
annotations_creators:
- found
pretty_name: SolarActivity
task_categories:
- time-series-forecasting
dataset_info:
features:
- name: date
dtype: datetime
- name: year
dtype: int
- name: month
dtype: int
- name: day
dtype: int
- name: datefracc
dtype: float
- name: counts
dtype: int
- name: std
dtype: float
- name: nobs
dtype: int
- name: indicator
dtype: string
size_categories:
- 10K<n<100K
--- |
xbilek25/test_cs | ---
license: mit
--- |
dischargesum/radiology | ---
dataset_info:
features:
- name: note_id
dtype: string
- name: subject_id
dtype: int64
- name: hadm_id
dtype: int64
- name: note_type
dtype: string
- name: note_seq
dtype: int64
- name: charttime
dtype: string
- name: storetime
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 361972454
num_examples: 259304
- name: valid
num_bytes: 76603854
num_examples: 54650
- name: test
num_bytes: 76383946
num_examples: 54797
download_size: 238441620
dataset_size: 514960254
---
# Dataset Card for "radiology"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
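For cards that include a `dataset_info` block, the per-split `num_bytes` values should sum to `dataset_size` (while `download_size` is usually smaller, since the hosted Parquet files are compressed). A quick consistency-check sketch, using the numbers from the radiology card above:

```python
def check_sizes(info: dict) -> bool:
    """Return True if the split byte counts add up to dataset_size."""
    total = sum(split["num_bytes"] for split in info["splits"])
    return total == info["dataset_size"]

# Numbers copied from the dischargesum/radiology card.
radiology = {
    "splits": [
        {"name": "train", "num_bytes": 361972454, "num_examples": 259304},
        {"name": "valid", "num_bytes": 76603854, "num_examples": 54650},
        {"name": "test", "num_bytes": 76383946, "num_examples": 54797},
    ],
    "dataset_size": 514960254,
}
print(check_sizes(radiology))  # True
```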
Seanxh/twitter_dataset_1713182417 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 28504
num_examples: 66
download_size: 15100
dataset_size: 28504
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingface/autotrain-data-imgstg2 | |
PJMixers/Math-100K | ---
language:
- en
tags:
- math
size_categories:
- 100K<n<1M
---
Created with [this script](https://gist.github.com/xzuyn/5807bbc2a305590f7b49b879dc0354ad), so I assume everything is 100% correct (with rounding). |
zolak/twitter_dataset_80_1713123591 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 315858
num_examples: 797
download_size: 170357
dataset_size: 315858
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
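The leaderboard details card that follows stores one split per evaluation run, named by timestamp (e.g. `2024_01_16T21_58_20.611483`), plus a `latest` alias. Because the timestamp format is fixed-width and zero-padded, lexicographic order matches chronological order, so the newest run can be picked with a plain string `max`; a sketch, using split names copied from that card:

```python
def latest_run(split_names):
    """Pick the newest timestamped split, ignoring the 'latest' alias.

    Names like '2024_01_16T21_58_20.611483' are zero-padded, so
    lexicographic comparison is also chronological comparison.
    """
    runs = [name for name in split_names if name != "latest"]
    return max(runs)

splits = ["2024_01_16T21_53_27.045677", "2024_01_16T21_58_20.611483", "latest"]
print(latest_run(splits))  # 2024_01_16T21_58_20.611483
```

This is the same convention the card's own `latest` split encodes: it always points at the parquet files of the most recent run.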
open-llm-leaderboard/details_moreh__MoMo-70B-lora-1.8.6-DPO | ---
pretty_name: Evaluation run of moreh/MoMo-72B-lora-1.8.6-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [moreh/MoMo-72B-lora-1.8.6-DPO](https://huggingface.co/moreh/MoMo-72B-lora-1.8.6-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.6-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T21:58:20.611483](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.6-DPO/blob/main/results_2024-01-16T21-58-20.611483.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7718135866116949,\n\
\ \"acc_stderr\": 0.027923193716335594,\n \"acc_norm\": 0.7742387772387228,\n\
\ \"acc_norm_stderr\": 0.02847436706882802,\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6899803980341069,\n\
\ \"mc2_stderr\": 0.01529930152264664\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6791808873720137,\n \"acc_stderr\": 0.013640943091946526,\n\
\ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.013374078615068742\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6712806213901613,\n\
\ \"acc_stderr\": 0.004687877183164464,\n \"acc_norm\": 0.8602867954590719,\n\
\ \"acc_norm_stderr\": 0.0034598069913898376\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8881578947368421,\n \"acc_stderr\": 0.02564834125169361,\n\
\ \"acc_norm\": 0.8881578947368421,\n \"acc_norm_stderr\": 0.02564834125169361\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n\
\ \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \
\ \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8264150943396227,\n \"acc_stderr\": 0.02331058302600625,\n\
\ \"acc_norm\": 0.8264150943396227,\n \"acc_norm_stderr\": 0.02331058302600625\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9305555555555556,\n\
\ \"acc_stderr\": 0.021257974822832055,\n \"acc_norm\": 0.9305555555555556,\n\
\ \"acc_norm_stderr\": 0.021257974822832055\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7687861271676301,\n\
\ \"acc_stderr\": 0.03214737302029468,\n \"acc_norm\": 0.7687861271676301,\n\
\ \"acc_norm_stderr\": 0.03214737302029468\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.02635515841334942,\n\
\ \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.02635515841334942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0333333333333333,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6904761904761905,\n\
\ \"acc_stderr\": 0.023809523809523867,\n \"acc_norm\": 0.6904761904761905,\n\
\ \"acc_norm_stderr\": 0.023809523809523867\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n\
\ \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.017308381281034516,\n\
\ \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.017308381281034516\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"\
acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\"\
: 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706467,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706467\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9393939393939394,\n \"acc_stderr\": 0.01699999492742161,\n \"\
acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.01699999492742161\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909046,\n\
\ \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909046\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8025641025641026,\n \"acc_stderr\": 0.020182646968674847,\n\
\ \"acc_norm\": 0.8025641025641026,\n \"acc_norm_stderr\": 0.020182646968674847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4777777777777778,\n \"acc_stderr\": 0.030455413985678408,\n \
\ \"acc_norm\": 0.4777777777777778,\n \"acc_norm_stderr\": 0.030455413985678408\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.02327425589870794,\n \
\ \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.02327425589870794\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"\
acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.926605504587156,\n \"acc_stderr\": 0.011180976446357573,\n \"\
acc_norm\": 0.926605504587156,\n \"acc_norm_stderr\": 0.011180976446357573\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6944444444444444,\n \"acc_stderr\": 0.031415546294025425,\n \"\
acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.031415546294025425\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \
\ \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n\
\ \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n\
\ \"acc_stderr\": 0.03247224389917947,\n \"acc_norm\": 0.8703703703703703,\n\
\ \"acc_norm_stderr\": 0.03247224389917947\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553848,\n\
\ \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553848\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.6428571428571429,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446914,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446914\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9182630906768838,\n\
\ \"acc_stderr\": 0.00979691395231317,\n \"acc_norm\": 0.9182630906768838,\n\
\ \"acc_norm_stderr\": 0.00979691395231317\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442262,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442262\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7016759776536313,\n\
\ \"acc_stderr\": 0.015301840045129285,\n \"acc_norm\": 0.7016759776536313,\n\
\ \"acc_norm_stderr\": 0.015301840045129285\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043714,\n\
\ \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043714\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8553054662379421,\n\
\ \"acc_stderr\": 0.019980476411175545,\n \"acc_norm\": 0.8553054662379421,\n\
\ \"acc_norm_stderr\": 0.019980476411175545\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062065,\n\
\ \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062065\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6595744680851063,\n \"acc_stderr\": 0.02826765748265015,\n \
\ \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.02826765748265015\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6140808344198174,\n\
\ \"acc_stderr\": 0.012433398911476141,\n \"acc_norm\": 0.6140808344198174,\n\
\ \"acc_norm_stderr\": 0.012433398911476141\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010112983,\n\
\ \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010112983\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8186274509803921,\n \"acc_stderr\": 0.015588643495370457,\n \
\ \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.015588643495370457\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.027265992434429093,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.027265992434429093\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6899803980341069,\n\
\ \"mc2_stderr\": 0.01529930152264664\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873494\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7680060652009097,\n \
\ \"acc_stderr\": 0.011626873175092412\n }\n}\n```"
repo_url: https://huggingface.co/moreh/MoMo-72B-lora-1.8.6-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|arc:challenge|25_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|arc:challenge|25_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|gsm8k|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|gsm8k|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hellaswag|10_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hellaswag|10_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-53-27.045677.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-58-20.611483.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T21-58-20.611483.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- '**/details_harness|winogrande|5_2024-01-16T21-53-27.045677.parquet'
- split: 2024_01_16T21_58_20.611483
path:
- '**/details_harness|winogrande|5_2024-01-16T21-58-20.611483.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T21-58-20.611483.parquet'
- config_name: results
data_files:
- split: 2024_01_16T21_53_27.045677
path:
- results_2024-01-16T21-53-27.045677.parquet
- split: 2024_01_16T21_58_20.611483
path:
- results_2024-01-16T21-58-20.611483.parquet
- split: latest
path:
- results_2024-01-16T21-58-20.611483.parquet
---
# Dataset Card for Evaluation run of moreh/MoMo-72B-lora-1.8.6-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [moreh/MoMo-72B-lora-1.8.6-DPO](https://huggingface.co/moreh/MoMo-72B-lora-1.8.6-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.6-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T21:58:20.611483](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-72B-lora-1.8.6-DPO/blob/main/results_2024-01-16T21-58-20.611483.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7718135866116949,
"acc_stderr": 0.027923193716335594,
"acc_norm": 0.7742387772387228,
"acc_norm_stderr": 0.02847436706882802,
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.6899803980341069,
"mc2_stderr": 0.01529930152264664
},
"harness|arc:challenge|25": {
"acc": 0.6791808873720137,
"acc_stderr": 0.013640943091946526,
"acc_norm": 0.7013651877133106,
"acc_norm_stderr": 0.013374078615068742
},
"harness|hellaswag|10": {
"acc": 0.6712806213901613,
"acc_stderr": 0.004687877183164464,
"acc_norm": 0.8602867954590719,
"acc_norm_stderr": 0.0034598069913898376
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8881578947368421,
"acc_stderr": 0.02564834125169361,
"acc_norm": 0.8881578947368421,
"acc_norm_stderr": 0.02564834125169361
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8264150943396227,
"acc_stderr": 0.02331058302600625,
"acc_norm": 0.8264150943396227,
"acc_norm_stderr": 0.02331058302600625
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9305555555555556,
"acc_stderr": 0.021257974822832055,
"acc_norm": 0.9305555555555556,
"acc_norm_stderr": 0.021257974822832055
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.03214737302029468,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.03214737302029468
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.02635515841334942,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.02635515841334942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8,
"acc_stderr": 0.0333333333333333,
"acc_norm": 0.8,
"acc_norm_stderr": 0.0333333333333333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6904761904761905,
"acc_stderr": 0.023809523809523867,
"acc_norm": 0.6904761904761905,
"acc_norm_stderr": 0.023809523809523867
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034516,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034516
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706467,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706467
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.01699999492742161,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.01699999492742161
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909046,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909046
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8025641025641026,
"acc_stderr": 0.020182646968674847,
"acc_norm": 0.8025641025641026,
"acc_norm_stderr": 0.020182646968674847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4777777777777778,
"acc_stderr": 0.030455413985678408,
"acc_norm": 0.4777777777777778,
"acc_norm_stderr": 0.030455413985678408
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.02327425589870794,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.02327425589870794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5629139072847682,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.5629139072847682,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.926605504587156,
"acc_stderr": 0.011180976446357573,
"acc_norm": 0.926605504587156,
"acc_norm_stderr": 0.011180976446357573
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.031415546294025425,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.031415546294025425
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917947,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917947
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553848,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553848
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.0339329572976101,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.0339329572976101
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446914,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446914
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9182630906768838,
"acc_stderr": 0.00979691395231317,
"acc_norm": 0.9182630906768838,
"acc_norm_stderr": 0.00979691395231317
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442262,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442262
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7016759776536313,
"acc_stderr": 0.015301840045129285,
"acc_norm": 0.7016759776536313,
"acc_norm_stderr": 0.015301840045129285
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.020091188936043714,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.020091188936043714
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8553054662379421,
"acc_stderr": 0.019980476411175545,
"acc_norm": 0.8553054662379421,
"acc_norm_stderr": 0.019980476411175545
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062065,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062065
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.02826765748265015,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.02826765748265015
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6140808344198174,
"acc_stderr": 0.012433398911476141,
"acc_norm": 0.6140808344198174,
"acc_norm_stderr": 0.012433398911476141
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8308823529411765,
"acc_stderr": 0.022770868010112983,
"acc_norm": 0.8308823529411765,
"acc_norm_stderr": 0.022770868010112983
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.015588643495370457,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.015588643495370457
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.027265992434429093,
"acc_norm": 0.92,
"acc_norm_stderr": 0.027265992434429093
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.6899803980341069,
"mc2_stderr": 0.01529930152264664
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873494
},
"harness|gsm8k|5": {
"acc": 0.7680060652009097,
"acc_stderr": 0.011626873175092412
}
}
```
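The aggregated JSON above can be post-processed directly once downloaded; as a minimal sketch (using a truncated, hard-coded copy of a few of the task entries shown above rather than fetching the full file), the per-task MMLU scores can be averaged like this:

```python
# Sketch: averaging accuracy over MMLU (hendrycksTest) sub-tasks from the
# results JSON. The dict below is a truncated excerpt of the full results
# shown above, included inline for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.41},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.7111111111111111},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8881578947368421},
}

# Select only the MMLU sub-tasks and compute their mean accuracy.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU mean acc over {len(mmlu_tasks)} tasks: {mean_acc:.4f}")
```

The same pattern applies to the full file: load the JSON from the results URL above and filter keys by the `harness|<task>|<n-shot>` naming convention.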
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HuggingKG/bitirme-ds-mini | ---
dataset_info:
features:
- name: Customer
dtype: string
- name: Customer Service
dtype: string
- name: customer_length
dtype: int64
- name: customerService_length
dtype: int64
splits:
- name: train
num_bytes: 190980.9423076923
num_examples: 748
- name: validation
num_bytes: 21447.05769230769
num_examples: 84
download_size: 64395
dataset_size: 212428.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Mitsuki-Sakamoto/alpaca_farm-reward-model-deberta-v3-large-v2-re-preference-64-nsample-2 | ---
dataset_info:
- config_name: alpaca_instructions-pythia_14m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
splits:
- name: preference
num_bytes: 25315216
num_examples: 20001
download_size: 12112309
dataset_size: 25315216
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
splits:
- name: preference
num_bytes: 25451634
num_examples: 20001
download_size: 12144402
dataset_size: 25451634
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
splits:
- name: preference
num_bytes: 25276914
num_examples: 20001
download_size: 11799025
dataset_size: 25276914
configs:
- config_name: alpaca_instructions-pythia_14m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: preference
path: alpaca_instructions-pythia_14m_alpaca_farm_instructions_sft_constant_pa_seed_1/preference-*
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: preference
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/preference-*
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: preference
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/preference-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_my_me | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 105336
num_examples: 459
- name: dev_mismatched
num_bytes: 97326
num_examples: 470
- name: test_matched
num_bytes: 109442
num_examples: 449
- name: test_mismatched
num_bytes: 112226
num_examples: 507
- name: train
num_bytes: 3823509
num_examples: 16768
download_size: 2449239
dataset_size: 4247839
---
# Dataset Card for "MULTI_VALUE_mnli_my_me"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
3bdtech/abdalrahman | ---
license: openrail
---
|
YawKar/wikitext_with_entitled_paragraphs | ---
size_categories:
- 100K<n<1M
source_datasets:
- wikitext
task_categories:
- text-generation
dataset_info:
features:
- name: heading
dtype: string
- name: paragraph
dtype: string
splits:
- name: test
num_bytes: 1381448
num_examples: 2187
- name: train
num_bytes: 574572095
num_examples: 860879
- name: validation
num_bytes: 1219110
num_examples: 1841
download_size: 313631586
dataset_size: 577172653
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
dvijay/databricks-dolly-15k-formatted | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 12195589
num_examples: 15011
download_size: 7749038
dataset_size: 12195589
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
llm-book/livedoor-news-corpus | ---
task_categories:
- summarization
language:
- ja
tags:
- news
pretty_name: livedoor-news-corpus
size_categories:
- 1K<n<10K
---
# Dataset Card for llm-book/livedoor-news-corpus
This dataset, used in the book "Introduction to Large Language Models" (『大規模言語モデル入門』), is based on the "livedoor News Corpus" provided by RONDHUIT Co., Ltd.
It contains the same data as the [original site](https://www.rondhuit.com/download.html).
The corpus was created by collecting news articles from "livedoor News", operated by NHN Japan Corporation, that are covered by the Creative Commons license below, and removing HTML tags as far as possible.
### Licence
Attribution-NoDerivs 2.1 Japan (CC BY-ND 2.1 JP) License |
autoevaluate/autoeval-eval-xsum-default-98b05d-39746145061 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: stacked-summaries/flan-t5-large-stacked-xsum-1024
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: stacked-summaries/flan-t5-large-stacked-xsum-1024
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
open-llm-leaderboard/details_Enoch__llama-65b-hf | ---
pretty_name: Evaluation run of Enoch/llama-65b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Enoch/llama-65b-hf](https://huggingface.co/Enoch/llama-65b-hf) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Enoch__llama-65b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-05T05:06:29.042599](https://huggingface.co/datasets/open-llm-leaderboard/details_Enoch__llama-65b-hf/blob/main/results_2023-12-05T05-06-29.042599.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6380517777268255,\n\
\ \"acc_stderr\": 0.032178718879849834,\n \"acc_norm\": 0.6421210460838432,\n\
\ \"acc_norm_stderr\": 0.0328302725617492,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.43425303494253065,\n\
\ \"mc2_stderr\": 0.013768101142659904\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449708,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6650069707229636,\n\
\ \"acc_stderr\": 0.004710234188047369,\n \"acc_norm\": 0.8608842859988051,\n\
\ \"acc_norm_stderr\": 0.003453599726736566\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.03794012674697032,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.03794012674697032\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.045595221419582166,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.045595221419582166\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"\
acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n \"\
acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121448,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121448\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633506,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633506\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099864,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099864\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.03324708911809117,\n \"\
acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.03324708911809117\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \
\ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489124,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489124\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.01385372417092253,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.01385372417092253\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n\
\ \"acc_stderr\": 0.016712467441702517,\n \"acc_norm\": 0.48268156424581005,\n\
\ \"acc_norm_stderr\": 0.016712467441702517\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426125,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426125\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4941329856584094,\n\
\ \"acc_stderr\": 0.012769356925216526,\n \"acc_norm\": 0.4941329856584094,\n\
\ \"acc_norm_stderr\": 0.012769356925216526\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n\
\ \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700033,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700033\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291282,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291282\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.43425303494253065,\n\
\ \"mc2_stderr\": 0.013768101142659904\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.824782951854775,\n \"acc_stderr\": 0.010684179227706175\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44806671721000757,\n \
\ \"acc_stderr\": 0.013697992668274522\n }\n}\n```"
repo_url: https://huggingface.co/Enoch/llama-65b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|arc:challenge|25_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|gsm8k|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hellaswag|10_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T05-06-29.042599.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-05T05-06-29.042599.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- '**/details_harness|winogrande|5_2023-12-05T05-06-29.042599.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-05T05-06-29.042599.parquet'
- config_name: results
data_files:
- split: 2023_12_05T05_06_29.042599
path:
- results_2023-12-05T05-06-29.042599.parquet
- split: latest
path:
- results_2023-12-05T05-06-29.042599.parquet
---
# Dataset Card for Evaluation run of Enoch/llama-65b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Enoch/llama-65b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Enoch/llama-65b-hf](https://huggingface.co/Enoch/llama-65b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Enoch__llama-65b-hf",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-05T05:06:29.042599](https://huggingface.co/datasets/open-llm-leaderboard/details_Enoch__llama-65b-hf/blob/main/results_2023-12-05T05-06-29.042599.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6380517777268255,
"acc_stderr": 0.032178718879849834,
"acc_norm": 0.6421210460838432,
"acc_norm_stderr": 0.0328302725617492,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.43425303494253065,
"mc2_stderr": 0.013768101142659904
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449708,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.6650069707229636,
"acc_stderr": 0.004710234188047369,
"acc_norm": 0.8608842859988051,
"acc_norm_stderr": 0.003453599726736566
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.03794012674697032,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.03794012674697032
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.045595221419582166,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.045595221419582166
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229862,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229862
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121448,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121448
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633506,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633506
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099864,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099864
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.03324708911809117,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.03324708911809117
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.035208939510976534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489124,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489124
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.01385372417092253,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.01385372417092253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.016712467441702517,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.016712467441702517
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426125,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426125
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4941329856584094,
"acc_stderr": 0.012769356925216526,
"acc_norm": 0.4941329856584094,
"acc_norm_stderr": 0.012769356925216526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700033,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700033
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291282,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291282
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.43425303494253065,
"mc2_stderr": 0.013768101142659904
},
"harness|winogrande|5": {
"acc": 0.824782951854775,
"acc_stderr": 0.010684179227706175
},
"harness|gsm8k|5": {
"acc": 0.44806671721000757,
"acc_stderr": 0.013697992668274522
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
andrewatef/PText | ---
dataset_info:
- config_name: articles
features:
- name: input
dtype: string
- name: output
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 12376328.0
num_examples: 2040
download_size: 5623581
dataset_size: 12376328.0
- config_name: articles2
features:
- name: title
dtype: string
- name: description
dtype: string
- name: reading_time_minutes
dtype: int64
- name: tags
dtype: string
- name: body_markdown
dtype: string
splits:
- name: train
num_bytes: 2567410.0
num_examples: 1090
download_size: 1362235
dataset_size: 2567410.0
- config_name: llama
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 291896975.0
num_examples: 1257591
download_size: 153320452
dataset_size: 291896975.0
- config_name: llama2
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 170086868.0
num_examples: 516177
download_size: 83326571
dataset_size: 170086868.0
- config_name: llama3
features:
- name: Instruction
dtype: string
- name: Response
dtype: string
splits:
- name: train
num_bytes: 142729487.0
num_examples: 516177
download_size: 101890981
dataset_size: 142729487.0
- config_name: llama4
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 157182443.0
num_examples: 516177
download_size: 82734120
dataset_size: 157182443.0
- config_name: llama5
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 53373019.0
num_examples: 172059
download_size: 27923481
dataset_size: 53373019.0
- config_name: llama6
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 51480370.0
num_examples: 172059
download_size: 33775616
dataset_size: 51480370.0
- config_name: llama7
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 3759851.0
num_examples: 13530
download_size: 2287275
dataset_size: 3759851.0
- config_name: llama8
features:
- name: input
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 101496004.9890677
num_examples: 120441
- name: test
num_bytes: 43498649.0109323
num_examples: 51618
download_size: 74071830
dataset_size: 144994654.0
- config_name: phi2
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 275548292.0
num_examples: 1257591
download_size: 151999212
dataset_size: 275548292.0
- config_name: summary
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 1252702430.0
num_examples: 287113
download_size: 771120161
dataset_size: 1252702430.0
- config_name: summary2
features:
- name: document
dtype: string
- name: summary
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 1117818826.0
num_examples: 44972
download_size: 648248844
dataset_size: 1117818826.0
configs:
- config_name: articles
data_files:
- split: train
path: articles/train-*
- config_name: articles2
data_files:
- split: train
path: articles2/train-*
- config_name: llama
data_files:
- split: train
path: llama/train-*
- config_name: llama2
data_files:
- split: train
path: llama2/train-*
- config_name: llama3
data_files:
- split: train
path: llama3/train-*
- config_name: llama4
data_files:
- split: train
path: llama4/train-*
- config_name: llama5
data_files:
- split: train
path: llama5/train-*
- config_name: llama6
data_files:
- split: train
path: llama6/train-*
- config_name: llama7
data_files:
- split: train
path: llama7/train-*
- config_name: llama8
data_files:
- split: train
path: llama8/train-*
- split: test
path: llama8/test-*
- config_name: phi2
data_files:
- split: train
path: phi2/train-*
- config_name: summary
data_files:
- split: train
path: summary/train-*
- config_name: summary2
data_files:
- split: train
path: summary2/train-*
---
|
yejeekang/legal_cn_instruction | ---
license: afl-3.0
---
|
claudiostu/voz4 | ---
license: openrail
---
|
zolak/twitter_dataset_50_1713176037 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 386466
num_examples: 957
download_size: 188285
dataset_size: 386466
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yezhengli9/wmt20-de-en | ---
dataset_info:
features:
- name: id (string)
dtype: string
- name: translation (translation)
dtype: string
splits:
- name: train
num_bytes: 474099
num_examples: 785
download_size: 299351
dataset_size: 474099
---
# Dataset Card for "wmt20-de-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Braddy/rsicd_deduplicate_97 | ---
dataset_info:
features:
- name: filename
dtype: string
- name: captions
sequence: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 449844757.25
num_examples: 8734
- name: test
num_bytes: 60130512.375
num_examples: 1093
- name: valid
num_bytes: 57307918.25
num_examples: 1094
download_size: 528945035
dataset_size: 567283187.875
---
# Dataset Card for "rsicd_deduplicate_97"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-107000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 664494
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
data-store/Facebook-Comment-v2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: results
struct:
- name: dangerous_content
dtype: string
- name: harassment
dtype: string
- name: hate_speech
dtype: string
- name: normal
dtype: string
- name: sexaully_explixit
dtype: string
splits:
- name: dangerous_content_dataset
num_bytes: 8423895.86985
num_examples: 31908
- name: harassment_dataset
num_bytes: 1545225.7279125
num_examples: 5853
- name: hate_speech_dataset
num_bytes: 4606636.5498625
num_examples: 17449
- name: sexaully_explixit_dataset
num_bytes: 662126.45235
num_examples: 2508
download_size: 11478399
dataset_size: 15237884.599975001
configs:
- config_name: default
data_files:
- split: dangerous_content_dataset
path: data/dangerous_content_dataset-*
- split: harassment_dataset
path: data/harassment_dataset-*
- split: hate_speech_dataset
path: data/hate_speech_dataset-*
- split: sexaully_explixit_dataset
path: data/sexaully_explixit_dataset-*
---
|
shields/catalan_commonvoice | ---
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
splits:
- name: train
num_bytes: 34635950777.0
num_examples: 905243
- name: validation
num_bytes: 652519005.0
num_examples: 16340
- name: test
num_bytes: 625225219.0
num_examples: 16340
download_size: 34496947979
dataset_size: 35913695001.0
---
# Dataset Card for "catalan_commonvoice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pixparse/cc12m-wds | ---
license: other
license_name: conceptual-12m
license_link: LICENSE
task_categories:
- image-to-text
size_categories:
- 10M<n<100M
---
# Dataset Card for Conceptual Captions 12M (CC12M)
## Dataset Description
- **Repository:** [Conceptual 12M repository](https://github.com/google-research-datasets/conceptual-12m)
- **Paper:** [Conceptual 12M: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts](https://arxiv.org/abs/2102.08981)
- **Point of Contact:** [Conceptual Captions e-mail](mailto:conceptual-captions@google.com)
### Dataset Summary
Conceptual 12M (CC12M) is a dataset of 12 million image-text pairs specifically meant to be used for vision-and-language pre-training.
Its data collection pipeline is a relaxed version of the one used in Conceptual Captions 3M (CC3M).
### Usage
This instance of Conceptual Captions is in [webdataset](https://github.com/webdataset/webdataset/commits/main) .tar format. It can be used with the webdataset library or upcoming releases of Hugging Face `datasets`.
...More Detail TBD
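In the webdataset format, each `.tar` shard groups one sample's files under a shared basename key with different extensions (e.g. `00001.jpg` and `00001.txt` together form one image-text pair). The `webdataset` library handles this grouping and decoding for you; purely as an illustration of the on-disk layout (with illustrative filenames, not the actual shard contents), a minimal stdlib sketch could group a shard's members into samples like this:

```python
import tarfile
from itertools import groupby

def iter_samples(fileobj):
    """Yield one {extension: bytes} dict per sample, grouping tar members
    that share the same basename key (e.g. '00001.jpg' + '00001.txt')."""
    with tarfile.open(fileobj=fileobj) as tar:
        members = sorted(tar.getmembers(), key=lambda m: m.name)
        for _key, group in groupby(members, key=lambda m: m.name.rsplit(".", 1)[0]):
            yield {m.name.rsplit(".", 1)[1]: tar.extractfile(m).read()
                   for m in group}
```

In practice you would point `webdataset.WebDataset(...)` at a shard pattern (something like `cc12m-train-{0000..2175}.tar`, inferred from the shard count below) and let it decode the images directly.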
### Data Splits
This dataset was downloaded using img2dataset. Images were resized on download whenever the shortest edge exceeded 512 pixels, so that the shortest edge is at most 512.
#### Train
* `cc12m-train-*.tar`
* Downloaded on 2021/18/22
* 2176 shards, 10968539 samples
## Additional Information
### Dataset Curators
Soravit Changpinyo, Piyush Sharma, Nan Ding and Radu Soricut.
### Licensing Information
The dataset may be freely used for any purpose, although acknowledgement of
Google LLC ("Google") as the data source would be appreciated. The dataset is
provided "AS IS" without any warranty, express or implied. Google disclaims all
liability for any damages, direct or indirect, resulting from the use of the
dataset.
### Citation Information
```bibtex
@inproceedings{changpinyo2021cc12m,
title = {{Conceptual 12M}: Pushing Web-Scale Image-Text Pre-Training To Recognize Long-Tail Visual Concepts},
author = {Changpinyo, Soravit and Sharma, Piyush and Ding, Nan and Soricut, Radu},
booktitle = {CVPR},
year = {2021},
}
``` |
enpaiva/qa-cultura-guarani_generated-answers | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: query
dtype: string
- name: base_answer
dtype: string
- name: generated_answer
dtype: string
- name: model
dtype: string
- name: prompt_used
dtype: string
- name: labels
dtype: string
splits:
- name: test
num_bytes: 162509
num_examples: 250
- name: train
num_bytes: 1739558
num_examples: 2746
download_size: 1078432
dataset_size: 1902067
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
DoSp/DomainSpeech | ---
dataset_info:
- config_name: Agriculture_Agricultural Biotechnology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143439038.0
num_examples: 300
download_size: 143297680
dataset_size: 143439038.0
- config_name: Agriculture_Agricultural Economics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138126833.0
num_examples: 300
download_size: 138014919
dataset_size: 138126833.0
- config_name: Agriculture_Agricultural Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143180625.0
num_examples: 300
download_size: 143050446
dataset_size: 143180625.0
- config_name: Agriculture_Agricultural Mechanization
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 154916533.0
num_examples: 300
download_size: 154747365
dataset_size: 154916533.0
- config_name: Agriculture_Animal Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 146354369.0
num_examples: 300
download_size: 146220983
dataset_size: 146354369.0
- config_name: Agriculture_Crop Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143046061.0
num_examples: 300
download_size: 142880656
dataset_size: 143046061.0
- config_name: Agriculture_Entomology and Pesticides
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143552360.0
num_examples: 300
download_size: 143407167
dataset_size: 143552360.0
- config_name: Agriculture_Fisheries
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138944065.0
num_examples: 300
download_size: 138788871
dataset_size: 138944065.0
- config_name: Agriculture_Forestry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 140535848.0
num_examples: 300
download_size: 140392528
dataset_size: 140535848.0
- config_name: Agriculture_Horticulture
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147926282.0
num_examples: 300
download_size: 147791744
dataset_size: 147926282.0
- config_name: Agriculture_Plant Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123700367.0
num_examples: 300
download_size: 123597900
dataset_size: 123700367.0
- config_name: Agriculture_Poultry Production
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147073759.0
num_examples: 300
download_size: 146906099
dataset_size: 147073759.0
- config_name: Agriculture_Soil Sciences and Plant Nutrition
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127354046.0
num_examples: 300
download_size: 127256326
dataset_size: 127354046.0
- config_name: Agriculture_Soil and Water Engineering and Conservation
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134537041.0
num_examples: 300
download_size: 134387592
dataset_size: 134537041.0
- config_name: Arts Design_Arts
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 119548638.0
num_examples: 300
download_size: 119440736
dataset_size: 119548638.0
- config_name: Arts Design_Design
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135083325.0
num_examples: 300
download_size: 134936083
dataset_size: 135083325.0
- config_name: Arts Design_Interior Architecture
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141126586.0
num_examples: 300
download_size: 140979090
dataset_size: 141126586.0
- config_name: Arts Design_Urban Planning
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147980852.0
num_examples: 300
download_size: 147794755
dataset_size: 147980852.0
- config_name: Business_Business Administration
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 121104401.0
num_examples: 300
download_size: 120968900
dataset_size: 121104401.0
- config_name: Business_Communications and Media Studies
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123893864.0
num_examples: 300
download_size: 123794867
dataset_size: 123893864.0
- config_name: Business_Decision Science and Operations Management
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 117426723.0
num_examples: 300
download_size: 117317155
dataset_size: 117426723.0
- config_name: Business_Entrepreneurship
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129740439.0
num_examples: 300
download_size: 129590618
dataset_size: 129740439.0
- config_name: Business_Human Resource Management
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134109342.0
num_examples: 300
download_size: 133946610
dataset_size: 134109342.0
- config_name: Business_Marketing
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131082374.0
num_examples: 300
download_size: 130942488
dataset_size: 131082374.0
- config_name: Business_Public Administration
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128436764.0
num_examples: 300
download_size: 128268709
dataset_size: 128436764.0
- config_name: Business_Strategic Management
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129705598.0
num_examples: 300
download_size: 129565676
dataset_size: 129705598.0
- config_name: Economics_Accounting and Finance
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130086798.0
num_examples: 300
download_size: 129970443
dataset_size: 130086798.0
- config_name: Economics_Banking and Insurance
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125576327.0
num_examples: 300
download_size: 125457196
dataset_size: 125576327.0
- config_name: Economics_Environmental Economics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 144396467.0
num_examples: 300
download_size: 144269317
dataset_size: 144396467.0
- config_name: Economics_Financial Economics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126345574.0
num_examples: 300
download_size: 126213407
dataset_size: 126345574.0
- config_name: Economics_International Trade
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129266847.0
num_examples: 300
download_size: 129131077
dataset_size: 129266847.0
- config_name: Education_Early Childhood Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134842546.0
num_examples: 300
download_size: 134669041
dataset_size: 134842546.0
- config_name: Education_Educational Administration
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129139609.0
num_examples: 300
download_size: 129009495
dataset_size: 129139609.0
- config_name: Education_Educational Psychology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132445380.0
num_examples: 300
download_size: 132314227
dataset_size: 132445380.0
- config_name: Education_Educational Technology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136349543.0
num_examples: 300
download_size: 136233919
dataset_size: 136349543.0
- config_name: Education_Elemantary Teacher Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128929721.0
num_examples: 300
download_size: 128832448
dataset_size: 128929721.0
- config_name: Education_Foreign Language Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132729799.0
num_examples: 300
download_size: 132576098
dataset_size: 132729799.0
- config_name: Education_Guidance and Counseling
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137961853.0
num_examples: 300
download_size: 137814518
dataset_size: 137961853.0
- config_name: Education_Mathematics and Science Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134215509.0
num_examples: 300
download_size: 134099723
dataset_size: 134215509.0
- config_name: Education_Physical Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132937777.0
num_examples: 300
download_size: 132805858
dataset_size: 132937777.0
- config_name: Education_Sociology of Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124285485.0
num_examples: 300
download_size: 124176688
dataset_size: 124285485.0
- config_name: Education_Special Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 152289384.0
num_examples: 300
download_size: 152131422
dataset_size: 152289384.0
- config_name: Engineering_Aerospace Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124292138.0
num_examples: 300
download_size: 124191922
dataset_size: 124292138.0
- config_name: Engineering_Automotive Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143846463.0
num_examples: 300
download_size: 143708257
dataset_size: 143846463.0
- config_name: Engineering_Bioengineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143137978.0
num_examples: 300
download_size: 143012457
dataset_size: 143137978.0
- config_name: Engineering_Biomaterials and Tissue Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137146975.0
num_examples: 300
download_size: 137025731
dataset_size: 137146975.0
- config_name: Engineering_Biomedical Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131378195.0
num_examples: 300
download_size: 131261573
dataset_size: 131378195.0
- config_name: Engineering_Chemical Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143133003.0
num_examples: 300
download_size: 143008061
dataset_size: 143133003.0
- config_name: Engineering_Civil Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130465075.0
num_examples: 300
download_size: 130356251
dataset_size: 130465075.0
- config_name: Engineering_Computer Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132679470.0
num_examples: 300
download_size: 132529121
dataset_size: 132679470.0
- config_name: Engineering_Earth Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 119846962.0
num_examples: 300
download_size: 119730185
dataset_size: 119846962.0
- config_name: Engineering_Electrical and Electronic Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126520050.0
num_examples: 300
download_size: 126360752
dataset_size: 126520050.0
- config_name: Engineering_Electrical and Information Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123849397.0
num_examples: 300
download_size: 123716265
dataset_size: 123849397.0
- config_name: Engineering_Energy Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137784439.0
num_examples: 300
download_size: 137683801
dataset_size: 137784439.0
- config_name: Engineering_Environmental Science and Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137198399.0
num_examples: 300
download_size: 137059643
dataset_size: 137198399.0
- config_name: Engineering_Food Science and Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133611502.0
num_examples: 300
download_size: 133484623
dataset_size: 133611502.0
- config_name: Engineering_Geomatics Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129068429.0
num_examples: 300
download_size: 128978145
dataset_size: 129068429.0
- config_name: Engineering_Industrial and Manufacturing Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 122429186.0
num_examples: 300
download_size: 122322658
dataset_size: 122429186.0
- config_name: Engineering_Marine Sciences and Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132973282.0
num_examples: 300
download_size: 132860408
dataset_size: 132973282.0
- config_name: Engineering_Mechanical Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135364923.0
num_examples: 300
download_size: 135221594
dataset_size: 135364923.0
- config_name: Engineering_Mechatronics Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126449973.0
num_examples: 300
download_size: 126341559
dataset_size: 126449973.0
- config_name: Engineering_Metallurgical and Materials Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124292613.0
num_examples: 300
download_size: 124165732
dataset_size: 124292613.0
- config_name: Engineering_Meteorology and Atmospheric Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 120671090.0
num_examples: 300
download_size: 120549799
dataset_size: 120671090.0
- config_name: Engineering_Mining Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133000100.0
num_examples: 300
download_size: 132898319
dataset_size: 133000100.0
- config_name: Engineering_Nanoscience and Nanotechnology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126720028.0
num_examples: 300
download_size: 126601451
dataset_size: 126720028.0
- config_name: Engineering_Nuclear Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138378246.0
num_examples: 300
download_size: 138263608
dataset_size: 138378246.0
- config_name: Engineering_Petroleum Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131247557.0
num_examples: 300
download_size: 131121220
dataset_size: 131247557.0
- config_name: Engineering_Textile Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138330600.0
num_examples: 300
download_size: 138157500
dataset_size: 138330600.0
- config_name: History_History
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130253621.0
num_examples: 300
download_size: 130146337
dataset_size: 130253621.0
- config_name: Law_Business Corporate Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132833176.0
num_examples: 300
download_size: 132657300
dataset_size: 132833176.0
- config_name: Law_Civil Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 120799613.0
num_examples: 300
download_size: 120705948
dataset_size: 120799613.0
- config_name: Law_Constitutional Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124263458.0
num_examples: 300
download_size: 124147786
dataset_size: 124263458.0
- config_name: Law_Criminal Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125936929.0
num_examples: 300
download_size: 125829464
dataset_size: 125936929.0
- config_name: Law_Employment Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132215591.0
num_examples: 300
download_size: 132097839
dataset_size: 132215591.0
- config_name: Law_Environmental Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141112457.0
num_examples: 300
download_size: 140980187
dataset_size: 141112457.0
- config_name: Law_European Union Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134430087.0
num_examples: 300
download_size: 134291260
dataset_size: 134430087.0
- config_name: Law_International Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132972818.0
num_examples: 300
download_size: 132822729
dataset_size: 132972818.0
- config_name: Law_Law and Legal Studies
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124902845.0
num_examples: 300
download_size: 124767772
dataset_size: 124902845.0
- config_name: Law_Public Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 119886102.0
num_examples: 300
download_size: 119768166
dataset_size: 119886102.0
- config_name: Law_Tax Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126528701.0
num_examples: 300
download_size: 126415023
dataset_size: 126528701.0
- config_name: Medical Sciences_Anatomy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124345096.0
num_examples: 300
download_size: 124253091
dataset_size: 124345096.0
- config_name: Medical Sciences_Anesthesiology and Reanimation
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129149763.0
num_examples: 300
download_size: 129028143
dataset_size: 129149763.0
- config_name: Medical Sciences_Audiology and Speech Pathology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134675137.0
num_examples: 300
download_size: 134564783
dataset_size: 134675137.0
- config_name: Medical Sciences_Bacteriology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129314886.0
num_examples: 300
download_size: 129190011
dataset_size: 129314886.0
- config_name: Medical Sciences_Biochemistry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125011940.0
num_examples: 300
download_size: 124932996
dataset_size: 125011940.0
- config_name: Medical Sciences_Biophysics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126020992.0
num_examples: 300
download_size: 125897336
dataset_size: 126020992.0
- config_name: Medical Sciences_Biostatistics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118651656.0
num_examples: 300
download_size: 118574377
dataset_size: 118651656.0
- config_name: Medical Sciences_Cardiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135302197.0
num_examples: 300
download_size: 135193717
dataset_size: 135302197.0
- config_name: Medical Sciences_Cardiovascular Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137987783.0
num_examples: 300
download_size: 137879610
dataset_size: 137987783.0
- config_name: Medical Sciences_Chest Diseases
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131629091.0
num_examples: 300
download_size: 131486615
dataset_size: 131629091.0
- config_name: Medical Sciences_Child and Adolescent Psychiatry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 152654204.0
num_examples: 300
download_size: 152523834
dataset_size: 152654204.0
- config_name: Medical Sciences_Clinical Pathology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133021566.0
num_examples: 300
download_size: 132912535
dataset_size: 133021566.0
- config_name: Medical Sciences_Dentistry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135479829.0
num_examples: 300
download_size: 135352775
dataset_size: 135479829.0
- config_name: Medical Sciences_Dermatology and Venereology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125724038.0
num_examples: 300
download_size: 125637034
dataset_size: 125724038.0
- config_name: Medical Sciences_Emergency Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135705901.0
num_examples: 300
download_size: 135572579
dataset_size: 135705901.0
- config_name: Medical Sciences_Endocrinology and Metabolism
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136547926.0
num_examples: 300
download_size: 136424174
dataset_size: 136547926.0
- config_name: Medical Sciences_Epidemiology and Public Health
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 122443760.0
num_examples: 300
download_size: 122331509
dataset_size: 122443760.0
- config_name: Medical Sciences_Family Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147162214.0
num_examples: 300
download_size: 147018769
dataset_size: 147162214.0
- config_name: Medical Sciences_Forensic Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135621041.0
num_examples: 300
download_size: 135465069
dataset_size: 135621041.0
- config_name: Medical Sciences_Gastroenterology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137843323.0
num_examples: 300
download_size: 137726037
dataset_size: 137843323.0
- config_name: Medical Sciences_General Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124773122.0
num_examples: 300
download_size: 124665167
dataset_size: 124773122.0
- config_name: Medical Sciences_Geriatrics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 149601165.0
num_examples: 300
download_size: 149441668
dataset_size: 149601165.0
- config_name: Medical Sciences_Health Administration
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137277345.0
num_examples: 300
download_size: 137127990
dataset_size: 137277345.0
- config_name: Medical Sciences_Health Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132340082.0
num_examples: 300
download_size: 132191040
dataset_size: 132340082.0
- config_name: Medical Sciences_Hematology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137161132.0
num_examples: 300
download_size: 137001185
dataset_size: 137161132.0
- config_name: Medical Sciences_Histology and Embriology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118029910.0
num_examples: 300
download_size: 117960878
dataset_size: 118029910.0
- config_name: Medical Sciences_Immunology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147571192.0
num_examples: 300
download_size: 147439785
dataset_size: 147571192.0
- config_name: Medical Sciences_Infectious Diseases
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130628555.0
num_examples: 300
download_size: 130515362
dataset_size: 130628555.0
- config_name: Medical Sciences_Internal Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132341319.0
num_examples: 300
download_size: 132242597
dataset_size: 132341319.0
- config_name: Medical Sciences_Medical Biochemistry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141321514.0
num_examples: 300
download_size: 141192803
dataset_size: 141321514.0
- config_name: Medical Sciences_Medical Biology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123713781.0
num_examples: 300
download_size: 123626323
dataset_size: 123713781.0
- config_name: Medical Sciences_Medical Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130348018.0
num_examples: 300
download_size: 130247442
dataset_size: 130348018.0
- config_name: Medical Sciences_Medical Genetics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132739285.0
num_examples: 300
download_size: 132620709
dataset_size: 132739285.0
- config_name: Medical Sciences_Medical Microbiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131818843.0
num_examples: 300
download_size: 131710880
dataset_size: 131818843.0
- config_name: Medical Sciences_Medical Oncology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132891133.0
num_examples: 300
download_size: 132742137
dataset_size: 132891133.0
- config_name: Medical Sciences_Medical Parasitology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127638224.0
num_examples: 300
download_size: 127533891
dataset_size: 127638224.0
- config_name: Medical Sciences_Medical Physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128012792.0
num_examples: 300
download_size: 127907099
dataset_size: 128012792.0
- config_name: Medical Sciences_Medical Physiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123009232.0
num_examples: 300
download_size: 122906320
dataset_size: 123009232.0
- config_name: Medical Sciences_Medical Virology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129423629.0
num_examples: 300
download_size: 129321752
dataset_size: 129423629.0
- config_name: Medical Sciences_Microbiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133143959.0
num_examples: 300
download_size: 132988663
dataset_size: 133143959.0
- config_name: Medical Sciences_Molecular Biology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127464967.0
num_examples: 300
download_size: 127337963
dataset_size: 127464967.0
- config_name: Medical Sciences_Mycology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137823673.0
num_examples: 300
download_size: 137708636
dataset_size: 137823673.0
- config_name: Medical Sciences_Neonatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141049258.0
num_examples: 300
download_size: 140933138
dataset_size: 141049258.0
- config_name: Medical Sciences_Nephrology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133628216.0
num_examples: 300
download_size: 133504498
dataset_size: 133628216.0
- config_name: Medical Sciences_Neurology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136508584.0
num_examples: 300
download_size: 136386376
dataset_size: 136508584.0
- config_name: Medical Sciences_Neuroscience
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126214227.0
num_examples: 300
download_size: 126138247
dataset_size: 126214227.0
- config_name: Medical Sciences_Neurosurgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 139598205.0
num_examples: 300
download_size: 139459556
dataset_size: 139598205.0
- config_name: Medical Sciences_Nuclear Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141475957.0
num_examples: 300
download_size: 141349187
dataset_size: 141475957.0
- config_name: Medical Sciences_Nursing and Midwifery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125067849.0
num_examples: 300
download_size: 124961824
dataset_size: 125067849.0
- config_name: Medical Sciences_Nutrition and Dietetics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137298930.0
num_examples: 300
download_size: 137177542
dataset_size: 137298930.0
- config_name: Medical Sciences_Obstetrics and Gynecology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 139462396.0
num_examples: 300
download_size: 139346196
dataset_size: 139462396.0
- config_name: Medical Sciences_Occupational Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 139789686.0
num_examples: 300
download_size: 139663646
dataset_size: 139789686.0
- config_name: Medical Sciences_Ophthalmology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128256576.0
num_examples: 300
download_size: 128137213
dataset_size: 128256576.0
- config_name: Medical Sciences_Optometry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124158526.0
num_examples: 300
download_size: 124043338
dataset_size: 124158526.0
- config_name: Medical Sciences_Orthopedics and Traumatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124954258.0
num_examples: 300
download_size: 124839699
dataset_size: 124954258.0
- config_name: Medical Sciences_Otorhinolaryngology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118568192.0
num_examples: 300
download_size: 118469263
dataset_size: 118568192.0
- config_name: Medical Sciences_Parasitology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128606032.0
num_examples: 300
download_size: 128481740
dataset_size: 128606032.0
- config_name: Medical Sciences_Pathology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136361718.0
num_examples: 300
download_size: 136219475
dataset_size: 136361718.0
- config_name: Medical Sciences_Pediatric Cardiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125106812.0
num_examples: 300
download_size: 125019625
dataset_size: 125106812.0
- config_name: Medical Sciences_Pediatric Endocrinology and Metabolism
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133790952.0
num_examples: 300
download_size: 133675104
dataset_size: 133790952.0
- config_name: Medical Sciences_Pediatric Gastroenterology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129939533.0
num_examples: 300
download_size: 129818254
dataset_size: 129939533.0
- config_name: Medical Sciences_Pediatric Hematology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130557879.0
num_examples: 300
download_size: 130455018
dataset_size: 130557879.0
- config_name: Medical Sciences_Pediatric Immunology and Allergy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124548519.0
num_examples: 300
download_size: 124454909
dataset_size: 124548519.0
- config_name: Medical Sciences_Pediatric Infectious Diseases
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129885463.0
num_examples: 300
download_size: 129772398
dataset_size: 129885463.0
- config_name: Medical Sciences_Pediatric Intensive Care
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136008333.0
num_examples: 300
download_size: 135876113
dataset_size: 136008333.0
- config_name: Medical Sciences_Pediatric Nephrology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133539276.0
num_examples: 300
download_size: 133420904
dataset_size: 133539276.0
- config_name: Medical Sciences_Pediatric Neurology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130006445.0
num_examples: 300
download_size: 129883565
dataset_size: 130006445.0
- config_name: Medical Sciences_Pediatric Pulmonology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131918311.0
num_examples: 300
download_size: 131790321
dataset_size: 131918311.0
- config_name: Medical Sciences_Pediatric Rheumatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141173770.0
num_examples: 300
download_size: 141048082
dataset_size: 141173770.0
- config_name: Medical Sciences_Pediatric Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129573172.0
num_examples: 300
download_size: 129467025
dataset_size: 129573172.0
- config_name: Medical Sciences_Pediatrics and Child Health
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 142513323.0
num_examples: 300
download_size: 142398544
dataset_size: 142513323.0
- config_name: Medical Sciences_Perinatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143238723.0
num_examples: 300
download_size: 143075573
dataset_size: 143238723.0
- config_name: Medical Sciences_Pharmacology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131266646.0
num_examples: 300
download_size: 131140692
dataset_size: 131266646.0
- config_name: Medical Sciences_Pharmacy & Pharmaceutical Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123536721.0
num_examples: 300
download_size: 123432708
dataset_size: 123536721.0
- config_name: Medical Sciences_Physical Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138883982.0
num_examples: 300
download_size: 138766735
dataset_size: 138883982.0
- config_name: Medical Sciences_Physiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129536853.0
num_examples: 300
download_size: 129405940
dataset_size: 129536853.0
- config_name: Medical Sciences_Physiotherapy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 142691474.0
num_examples: 300
download_size: 142563292
dataset_size: 142691474.0
- config_name: Medical Sciences_Plastic Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131666178.0
num_examples: 300
download_size: 131555009
dataset_size: 131666178.0
- config_name: Medical Sciences_Podiatry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130451437.0
num_examples: 300
download_size: 130325455
dataset_size: 130451437.0
- config_name: Medical Sciences_Psychiatry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137513120.0
num_examples: 300
download_size: 137383527
dataset_size: 137513120.0
- config_name: Medical Sciences_Radiation Oncology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 146934885.0
num_examples: 300
download_size: 146815433
dataset_size: 146934885.0
- config_name: Medical Sciences_Radiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 148168300.0
num_examples: 300
download_size: 148016600
dataset_size: 148168300.0
- config_name: Medical Sciences_Rheumatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134954977.0
num_examples: 300
download_size: 134841511
dataset_size: 134954977.0
- config_name: Medical Sciences_Sport Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127576853.0
num_examples: 300
download_size: 127455316
dataset_size: 127576853.0
- config_name: Medical Sciences_Sports Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135083531.0
num_examples: 300
download_size: 134931348
dataset_size: 135083531.0
- config_name: Medical Sciences_Thoracic Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135906719.0
num_examples: 300
download_size: 135778944
dataset_size: 135906719.0
- config_name: Medical Sciences_Urology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135596805.0
num_examples: 300
download_size: 135473770
dataset_size: 135596805.0
- config_name: Medical Sciences_Veterinary Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135858075.0
num_examples: 300
download_size: 135730165
dataset_size: 135858075.0
- config_name: Medical Sciences_Virology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127937723.0
num_examples: 300
download_size: 127838000
dataset_size: 127937723.0
- config_name: Natural Sciences_Applied physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126350419.0
num_examples: 300
download_size: 126248052
dataset_size: 126350419.0
- config_name: Natural Sciences_Astrophysics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129300703.0
num_examples: 300
download_size: 129158168
dataset_size: 129300703.0
- config_name: Natural Sciences_Atomic, Molecular and Optical physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 111687416.0
num_examples: 300
download_size: 111582196
dataset_size: 111687416.0
- config_name: Natural Sciences_Biological Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126050128.0
num_examples: 300
download_size: 125945290
dataset_size: 126050128.0
- config_name: Natural Sciences_Chemical Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125925185.0
num_examples: 300
download_size: 125809833
dataset_size: 125925185.0
- config_name: Natural Sciences_Condensed matter physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 119880781.0
num_examples: 300
download_size: 119762462
dataset_size: 119880781.0
- config_name: Natural Sciences_Geography
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127678573.0
num_examples: 300
download_size: 127551992
dataset_size: 127678573.0
- config_name: Natural Sciences_Mathematical Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118078153.0
num_examples: 300
download_size: 117964811
dataset_size: 118078153.0
- config_name: Natural Sciences_Molecular Biology and Genetics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 112294561.0
num_examples: 300
download_size: 112198712
dataset_size: 112294561.0
- config_name: Natural Sciences_Nuclear and Particle Physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 121217938.0
num_examples: 300
download_size: 121108176
dataset_size: 121217938.0
- config_name: Philosophy_Philosophy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118345587.0
num_examples: 300
download_size: 118229918
dataset_size: 118345587.0
- config_name: Social Sciences_Anthropology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128840376.0
num_examples: 300
download_size: 128696216
dataset_size: 128840376.0
- config_name: Social Sciences_Archeology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118321559.0
num_examples: 300
download_size: 118206487
dataset_size: 118321559.0
- config_name: Social Sciences_Child Development
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126576147.0
num_examples: 300
download_size: 126464165
dataset_size: 126576147.0
- config_name: Social Sciences_Demography
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132052357.0
num_examples: 300
download_size: 131901043
dataset_size: 132052357.0
- config_name: Social Sciences_Higher Education Studies
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141786814.0
num_examples: 300
download_size: 141661233
dataset_size: 141786814.0
- config_name: Social Sciences_Housing
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 146169123.0
num_examples: 300
download_size: 146033728
dataset_size: 146169123.0
- config_name: Social Sciences_International Relations
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133839740.0
num_examples: 300
download_size: 133676984
dataset_size: 133839740.0
- config_name: Social Sciences_Library and Information Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123726092.0
num_examples: 300
download_size: 123594991
dataset_size: 123726092.0
- config_name: Social Sciences_Linguistics and Literature
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 114704654.0
num_examples: 300
download_size: 114595695
dataset_size: 114704654.0
- config_name: Social Sciences_Open and Distance Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143105156.0
num_examples: 300
download_size: 142956652
dataset_size: 143105156.0
- config_name: Social Sciences_Political Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 121094624.0
num_examples: 300
download_size: 120963345
dataset_size: 121094624.0
- config_name: Social Sciences_Psychology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136275316.0
num_examples: 300
download_size: 136139111
dataset_size: 136275316.0
- config_name: Social Sciences_Regional Studies
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124353132.0
num_examples: 300
download_size: 124243486
dataset_size: 124353132.0
- config_name: Social Sciences_Social Policy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134904666.0
num_examples: 300
download_size: 134753980
dataset_size: 134904666.0
- config_name: Social Sciences_Social Work
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134077053.0
num_examples: 300
download_size: 133967130
dataset_size: 134077053.0
- config_name: Social Sciences_Sociology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133329557.0
num_examples: 300
download_size: 133180184
dataset_size: 133329557.0
- config_name: Social Sciences_Tourism and Hospitality
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 142262231.0
num_examples: 300
download_size: 142100591
dataset_size: 142262231.0
- config_name: Social Sciences_Transportation Science and Technology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 140265612.0
num_examples: 300
download_size: 140124964
dataset_size: 140265612.0
- config_name: Theology_Theology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 115449408.0
num_examples: 300
download_size: 115356333
dataset_size: 115449408.0
- config_name: testing
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 115449370.0
num_examples: 300
download_size: 115356390
dataset_size: 115449370.0
configs:
- config_name: Agriculture_Agricultural Biotechnology
data_files:
- split: test
path: content/Agriculture/Agricultural Biotechnology/test-*
- config_name: Agriculture_Agricultural Economics
data_files:
- split: test
path: content/Agriculture/Agricultural Economics/test-*
- config_name: Agriculture_Agricultural Engineering
data_files:
- split: test
path: content/Agriculture/Agricultural Engineering/test-*
- config_name: Agriculture_Agricultural Mechanization
data_files:
- split: test
path: content/Agriculture/Agricultural Mechanization/test-*
- config_name: Agriculture_Animal Science
data_files:
- split: test
path: content/Agriculture/Animal Science/test-*
- config_name: Agriculture_Crop Science
data_files:
- split: test
path: content/Agriculture/Crop Science/test-*
- config_name: Agriculture_Entomology and Pesticides
data_files:
- split: test
path: content/Agriculture/Entomology and Pesticides/test-*
- config_name: Agriculture_Fisheries
data_files:
- split: test
path: content/Agriculture/Fisheries/test-*
- config_name: Agriculture_Forestry
data_files:
- split: test
path: content/Agriculture/Forestry/test-*
- config_name: Agriculture_Horticulture
data_files:
- split: test
path: content/Agriculture/Horticulture/test-*
- config_name: Agriculture_Plant Science
data_files:
- split: test
path: content/Agriculture/Plant Science/test-*
- config_name: Agriculture_Poultry Production
data_files:
- split: test
path: content/Agriculture/Poultry Production/test-*
- config_name: Agriculture_Soil Sciences and Plant Nutrition
data_files:
- split: test
path: content/Agriculture/Soil Sciences and Plant Nutrition/test-*
- config_name: Agriculture_Soil and Water Engineering and Conservation
data_files:
- split: test
path: content/Agriculture/Soil and Water Engineering and Conservation/test-*
- config_name: Arts Design_Arts
data_files:
- split: test
path: content/Arts Design/Arts/test-*
- config_name: Arts Design_Design
data_files:
- split: test
path: content/Arts Design/Design/test-*
- config_name: Arts Design_Interior Architecture
data_files:
- split: test
path: content/Arts Design/Interior Architecture/test-*
- config_name: Arts Design_Urban Planning
data_files:
- split: test
path: content/Arts Design/Urban Planning/test-*
- config_name: Business_Business Administration
data_files:
- split: test
path: content/Business/Business Administration/test-*
- config_name: Business_Communications and Media Studies
data_files:
- split: test
path: content/Business/Communications and Media Studies/test-*
- config_name: Business_Decision Science and Operations Management
data_files:
- split: test
path: content/Business/Decision Science and Operations Management/test-*
- config_name: Business_Entrepreneurship
data_files:
- split: test
path: content/Business/Entrepreneurship/test-*
- config_name: Business_Human Resource Management
data_files:
- split: test
path: content/Business/Human Resource Management/test-*
- config_name: Business_Marketing
data_files:
- split: test
path: content/Business/Marketing/test-*
- config_name: Business_Public Administration
data_files:
- split: test
path: content/Business/Public Administration/test-*
- config_name: Business_Strategic Management
data_files:
- split: test
path: content/Business/Strategic Management/test-*
- config_name: Economics_Accounting and Finance
data_files:
- split: test
path: content/Economics/Accounting and Finance/test-*
- config_name: Economics_Banking and Insurance
data_files:
- split: test
path: content/Economics/Banking and Insurance/test-*
- config_name: Economics_Environmental Economics
data_files:
- split: test
path: content/Economics/Environmental Economics/test-*
- config_name: Economics_Financial Economics
data_files:
- split: test
path: content/Economics/Financial Economics/test-*
- config_name: Economics_International Trade
data_files:
- split: test
path: content/Economics/International Trade/test-*
- config_name: Education_Early Childhood Education
data_files:
- split: test
path: content/Education/Early Childhood Education/test-*
- config_name: Education_Educational Administration
data_files:
- split: test
path: content/Education/Educational Administration/test-*
- config_name: Education_Educational Psychology
data_files:
- split: test
path: content/Education/Educational Psychology/test-*
- config_name: Education_Educational Technology
data_files:
- split: test
path: content/Education/Educational Technology/test-*
- config_name: Education_Elemantary Teacher Education
data_files:
- split: test
path: content/Education/Elemantary Teacher Education/test-*
- config_name: Education_Foreign Language Education
data_files:
- split: test
path: content/Education/Foreign Language Education/test-*
- config_name: Education_Guidance and Counseling
data_files:
- split: test
path: content/Education/Guidance and Counseling/test-*
- config_name: Education_Mathematics and Science Education
data_files:
- split: test
path: content/Education/Mathematics and Science Education/test-*
- config_name: Education_Physical Education
data_files:
- split: test
path: content/Education/Physical Education/test-*
- config_name: Education_Sociology of Education
data_files:
- split: test
path: content/Education/Sociology of Education/test-*
- config_name: Education_Special Education
data_files:
- split: test
path: content/Education/Special Education/test-*
- config_name: Engineering_Aerospace Engineering
data_files:
- split: test
path: content/Engineering/Aerospace Engineering/test-*
- config_name: Engineering_Automotive Engineering
data_files:
- split: test
path: content/Engineering/Automotive Engineering/test-*
- config_name: Engineering_Bioengineering
data_files:
- split: test
path: content/Engineering/Bioengineering/test-*
- config_name: Engineering_Biomaterials and Tissue Engineering
data_files:
- split: test
path: content/Engineering/Biomaterials and Tissue Engineering/test-*
- config_name: Engineering_Biomedical Engineering
data_files:
- split: test
path: content/Engineering/Biomedical Engineering/test-*
- config_name: Engineering_Chemical Engineering
data_files:
- split: test
path: content/Engineering/Chemical Engineering/test-*
- config_name: Engineering_Civil Engineering
data_files:
- split: test
path: content/Engineering/Civil Engineering/test-*
- config_name: Engineering_Computer Science
data_files:
- split: test
path: content/Engineering/Computer Science/test-*
- config_name: Engineering_Earth Sciences
data_files:
- split: test
path: content/Engineering/Earth Sciences/test-*
- config_name: Engineering_Electrical and Electronic Engineering
data_files:
- split: test
path: content/Engineering/Electrical and Electronic Engineering/test-*
- config_name: Engineering_Electrical and Information Engineering
data_files:
- split: test
path: content/Engineering/Electrical and Information Engineering/test-*
- config_name: Engineering_Energy Engineering
data_files:
- split: test
path: content/Engineering/Energy Engineering/test-*
- config_name: Engineering_Environmental Science and Engineering
data_files:
- split: test
path: content/Engineering/Environmental Science and Engineering/test-*
- config_name: Engineering_Food Science and Engineering
data_files:
- split: test
path: content/Engineering/Food Science and Engineering/test-*
- config_name: Engineering_Geomatics Engineering
data_files:
- split: test
path: content/Engineering/Geomatics Engineering/test-*
- config_name: Engineering_Industrial and Manufacturing Engineering
data_files:
- split: test
path: content/Engineering/Industrial and Manufacturing Engineering/test-*
- config_name: Engineering_Marine Sciences and Engineering
data_files:
- split: test
path: content/Engineering/Marine Sciences and Engineering/test-*
- config_name: Engineering_Mechanical Engineering
data_files:
- split: test
path: content/Engineering/Mechanical Engineering/test-*
- config_name: Engineering_Mechatronics Engineering
data_files:
- split: test
path: content/Engineering/Mechatronics Engineering/test-*
- config_name: Engineering_Metallurgical and Materials Engineering
data_files:
- split: test
path: content/Engineering/Metallurgical and Materials Engineering/test-*
- config_name: Engineering_Meteorology and Atmospheric Sciences
data_files:
- split: test
path: content/Engineering/Meteorology and Atmospheric Sciences/test-*
- config_name: Engineering_Mining Engineering
data_files:
- split: test
path: content/Engineering/Mining Engineering/test-*
- config_name: Engineering_Nanoscience and Nanotechnology
data_files:
- split: test
path: content/Engineering/Nanoscience and Nanotechnology/test-*
- config_name: Engineering_Nuclear Engineering
data_files:
- split: test
path: content/Engineering/Nuclear Engineering/test-*
- config_name: Engineering_Petroleum Engineering
data_files:
- split: test
path: content/Engineering/Petroleum Engineering/test-*
- config_name: Engineering_Textile Engineering
data_files:
- split: test
path: content/Engineering/Textile Engineering/test-*
- config_name: History_History
data_files:
- split: test
path: content/History/History/test-*
- config_name: Law_Business Corporate Law
data_files:
- split: test
path: content/Law/Business Corporate Law/test-*
- config_name: Law_Civil Law
data_files:
- split: test
path: content/Law/Civil Law/test-*
- config_name: Law_Constitutional Law
data_files:
- split: test
path: content/Law/Constitutional Law/test-*
- config_name: Law_Criminal Law
data_files:
- split: test
path: content/Law/Criminal Law/test-*
- config_name: Law_Employment Law
data_files:
- split: test
path: content/Law/Employment Law/test-*
- config_name: Law_Environmental Law
data_files:
- split: test
path: content/Law/Environmental Law/test-*
- config_name: Law_European Union Law
data_files:
- split: test
path: content/Law/European Union Law/test-*
- config_name: Law_International Law
data_files:
- split: test
path: content/Law/International Law/test-*
- config_name: Law_Law and Legal Studies
data_files:
- split: test
path: content/Law/Law and Legal Studies/test-*
- config_name: Law_Public Law
data_files:
- split: test
path: content/Law/Public Law/test-*
- config_name: Law_Tax Law
data_files:
- split: test
path: content/Law/Tax Law/test-*
- config_name: Medical Sciences_Anatomy
data_files:
- split: test
path: content/Medical Sciences/Anatomy/test-*
- config_name: Medical Sciences_Anesthesiology and Reanimation
data_files:
- split: test
path: content/Medical Sciences/Anesthesiology and Reanimation/test-*
- config_name: Medical Sciences_Audiology and Speech Pathology
data_files:
- split: test
path: content/Medical Sciences/Audiology and Speech Pathology/test-*
- config_name: Medical Sciences_Bacteriology
data_files:
- split: test
path: content/Medical Sciences/Bacteriology/test-*
- config_name: Medical Sciences_Biochemistry
data_files:
- split: test
path: content/Medical Sciences/Biochemistry/test-*
- config_name: Medical Sciences_Biophysics
data_files:
- split: test
path: content/Medical Sciences/Biophysics/test-*
- config_name: Medical Sciences_Biostatistics
data_files:
- split: test
path: content/Medical Sciences/Biostatistics/test-*
- config_name: Medical Sciences_Cardiology
data_files:
- split: test
path: content/Medical Sciences/Cardiology/test-*
- config_name: Medical Sciences_Cardiovascular Surgery
data_files:
- split: test
path: content/Medical Sciences/Cardiovascular Surgery/test-*
- config_name: Medical Sciences_Chest Diseases
data_files:
- split: test
path: content/Medical Sciences/Chest Diseases/test-*
- config_name: Medical Sciences_Child and Adolescent Psychiatry
data_files:
- split: test
path: content/Medical Sciences/Child and Adolescent Psychiatry/test-*
- config_name: Medical Sciences_Clinical Pathology
data_files:
- split: test
path: content/Medical Sciences/Clinical Pathology/test-*
- config_name: Medical Sciences_Dentistry
data_files:
- split: test
path: content/Medical Sciences/Dentistry/test-*
- config_name: Medical Sciences_Dermatology and Venereology
data_files:
- split: test
path: content/Medical Sciences/Dermatology and Venereology/test-*
- config_name: Medical Sciences_Emergency Medicine
data_files:
- split: test
path: content/Medical Sciences/Emergency Medicine/test-*
- config_name: Medical Sciences_Endocrinology and Metabolism
data_files:
- split: test
path: content/Medical Sciences/Endocrinology and Metabolism/test-*
- config_name: Medical Sciences_Epidemiology and Public Health
data_files:
- split: test
path: content/Medical Sciences/Epidemiology and Public Health/test-*
- config_name: Medical Sciences_Family Medicine
data_files:
- split: test
path: content/Medical Sciences/Family Medicine/test-*
- config_name: Medical Sciences_Forensic Medicine
data_files:
- split: test
path: content/Medical Sciences/Forensic Medicine/test-*
- config_name: Medical Sciences_Gastroenterology
data_files:
- split: test
path: content/Medical Sciences/Gastroenterology/test-*
- config_name: Medical Sciences_General Surgery
data_files:
- split: test
path: content/Medical Sciences/General Surgery/test-*
- config_name: Medical Sciences_Geriatrics
data_files:
- split: test
path: content/Medical Sciences/Geriatrics/test-*
- config_name: Medical Sciences_Health Administration
data_files:
- split: test
path: content/Medical Sciences/Health Administration/test-*
- config_name: Medical Sciences_Health Sciences
data_files:
- split: test
path: content/Medical Sciences/Health Sciences/test-*
- config_name: Medical Sciences_Hematology
data_files:
- split: test
path: content/Medical Sciences/Hematology/test-*
- config_name: Medical Sciences_Histology and Embriology
data_files:
- split: test
path: content/Medical Sciences/Histology and Embriology/test-*
- config_name: Medical Sciences_Immunology
data_files:
- split: test
path: content/Medical Sciences/Immunology/test-*
- config_name: Medical Sciences_Infectious Diseases
data_files:
- split: test
path: content/Medical Sciences/Infectious Diseases/test-*
- config_name: Medical Sciences_Internal Medicine
data_files:
- split: test
path: content/Medical Sciences/Internal Medicine/test-*
- config_name: Medical Sciences_Medical Biochemistry
data_files:
- split: test
path: content/Medical Sciences/Medical Biochemistry/test-*
- config_name: Medical Sciences_Medical Biology
data_files:
- split: test
path: content/Medical Sciences/Medical Biology/test-*
- config_name: Medical Sciences_Medical Education
data_files:
- split: test
path: content/Medical Sciences/Medical Education/test-*
- config_name: Medical Sciences_Medical Genetics
data_files:
- split: test
path: content/Medical Sciences/Medical Genetics/test-*
- config_name: Medical Sciences_Medical Microbiology
data_files:
- split: test
path: content/Medical Sciences/Medical Microbiology/test-*
- config_name: Medical Sciences_Medical Oncology
data_files:
- split: test
path: content/Medical Sciences/Medical Oncology/test-*
- config_name: Medical Sciences_Medical Parasitology
data_files:
- split: test
path: content/Medical Sciences/Medical Parasitology/test-*
- config_name: Medical Sciences_Medical Physics
data_files:
- split: test
path: content/Medical Sciences/Medical Physics/test-*
- config_name: Medical Sciences_Medical Physiology
data_files:
- split: test
path: content/Medical Sciences/Medical Physiology/test-*
- config_name: Medical Sciences_Medical Virology
data_files:
- split: test
path: content/Medical Sciences/Medical Virology/test-*
- config_name: Medical Sciences_Microbiology
data_files:
- split: test
path: content/Medical Sciences/Microbiology/test-*
- config_name: Medical Sciences_Molecular Biology
data_files:
- split: test
path: content/Medical Sciences/Molecular Biology/test-*
- config_name: Medical Sciences_Mycology
data_files:
- split: test
path: content/Medical Sciences/Mycology/test-*
- config_name: Medical Sciences_Neonatology
data_files:
- split: test
path: content/Medical Sciences/Neonatology/test-*
- config_name: Medical Sciences_Nephrology
data_files:
- split: test
path: content/Medical Sciences/Nephrology/test-*
- config_name: Medical Sciences_Neurology
data_files:
- split: test
path: content/Medical Sciences/Neurology/test-*
- config_name: Medical Sciences_Neuroscience
data_files:
- split: test
path: content/Medical Sciences/Neuroscience/test-*
- config_name: Medical Sciences_Neurosurgery
data_files:
- split: test
path: content/Medical Sciences/Neurosurgery/test-*
- config_name: Medical Sciences_Nuclear Medicine
data_files:
- split: test
path: content/Medical Sciences/Nuclear Medicine/test-*
- config_name: Medical Sciences_Nursing and Midwifery
data_files:
- split: test
path: content/Medical Sciences/Nursing and Midwifery/test-*
- config_name: Medical Sciences_Nutrition and Dietetics
data_files:
- split: test
path: content/Medical Sciences/Nutrition and Dietetics/test-*
- config_name: Medical Sciences_Obstetrics and Gynecology
data_files:
- split: test
path: content/Medical Sciences/Obstetrics and Gynecology/test-*
- config_name: Medical Sciences_Occupational Medicine
data_files:
- split: test
path: content/Medical Sciences/Occupational Medicine/test-*
- config_name: Medical Sciences_Ophthalmology
data_files:
- split: test
path: content/Medical Sciences/Ophthalmology/test-*
- config_name: Medical Sciences_Optometry
data_files:
- split: test
path: content/Medical Sciences/Optometry/test-*
- config_name: Medical Sciences_Orthopedics and Traumatology
data_files:
- split: test
path: content/Medical Sciences/Orthopedics and Traumatology/test-*
- config_name: Medical Sciences_Otorhinolaryngology
data_files:
- split: test
path: content/Medical Sciences/Otorhinolaryngology/test-*
- config_name: Medical Sciences_Parasitology
data_files:
- split: test
path: content/Medical Sciences/Parasitology/test-*
- config_name: Medical Sciences_Pathology
data_files:
- split: test
path: content/Medical Sciences/Pathology/test-*
- config_name: Medical Sciences_Pediatric Cardiology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Cardiology/test-*
- config_name: Medical Sciences_Pediatric Endocrinology and Metabolism
data_files:
- split: test
path: content/Medical Sciences/Pediatric Endocrinology and Metabolism/test-*
- config_name: Medical Sciences_Pediatric Gastroenterology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Gastroenterology/test-*
- config_name: Medical Sciences_Pediatric Hematology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Hematology/test-*
- config_name: Medical Sciences_Pediatric Immunology and Allergy
data_files:
- split: test
path: content/Medical Sciences/Pediatric Immunology and Allergy/test-*
- config_name: Medical Sciences_Pediatric Infectious Diseases
data_files:
- split: test
path: content/Medical Sciences/Pediatric Infectious Diseases/test-*
- config_name: Medical Sciences_Pediatric Intensive Care
data_files:
- split: test
path: content/Medical Sciences/Pediatric Intensive Care/test-*
- config_name: Medical Sciences_Pediatric Nephrology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Nephrology/test-*
- config_name: Medical Sciences_Pediatric Neurology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Neurology/test-*
- config_name: Medical Sciences_Pediatric Pulmonology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Pulmonology/test-*
- config_name: Medical Sciences_Pediatric Rheumatology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Rheumatology/test-*
- config_name: Medical Sciences_Pediatric Surgery
data_files:
- split: test
path: content/Medical Sciences/Pediatric Surgery/test-*
- config_name: Medical Sciences_Pediatrics and Child Health
data_files:
- split: test
path: content/Medical Sciences/Pediatrics and Child Health/test-*
- config_name: Medical Sciences_Perinatology
data_files:
- split: test
path: content/Medical Sciences/Perinatology/test-*
- config_name: Medical Sciences_Pharmacology
data_files:
- split: test
path: content/Medical Sciences/Pharmacology/test-*
- config_name: Medical Sciences_Pharmacy & Pharmaceutical Sciences
data_files:
- split: test
path: content/Medical Sciences/Pharmacy & Pharmaceutical Sciences/test-*
- config_name: Medical Sciences_Physical Medicine
data_files:
- split: test
path: content/Medical Sciences/Physical Medicine/test-*
- config_name: Medical Sciences_Physiology
data_files:
- split: test
path: content/Medical Sciences/Physiology/test-*
- config_name: Medical Sciences_Physiotherapy
data_files:
- split: test
path: content/Medical Sciences/Physiotherapy/test-*
- config_name: Medical Sciences_Plastic Surgery
data_files:
- split: test
path: content/Medical Sciences/Plastic Surgery/test-*
- config_name: Medical Sciences_Podiatry
data_files:
- split: test
path: content/Medical Sciences/Podiatry/test-*
- config_name: Medical Sciences_Psychiatry
data_files:
- split: test
path: content/Medical Sciences/Psychiatry/test-*
- config_name: Medical Sciences_Radiation Oncology
data_files:
- split: test
path: content/Medical Sciences/Radiation Oncology/test-*
- config_name: Medical Sciences_Radiology
data_files:
- split: test
path: content/Medical Sciences/Radiology/test-*
- config_name: Medical Sciences_Rheumatology
data_files:
- split: test
path: content/Medical Sciences/Rheumatology/test-*
- config_name: Medical Sciences_Sport Science
data_files:
- split: test
path: content/Medical Sciences/Sport Science/test-*
- config_name: Medical Sciences_Sports Medicine
data_files:
- split: test
path: content/Medical Sciences/Sports Medicine/test-*
- config_name: Medical Sciences_Thoracic Surgery
data_files:
- split: test
path: content/Medical Sciences/Thoracic Surgery/test-*
- config_name: Medical Sciences_Urology
data_files:
- split: test
path: content/Medical Sciences/Urology/test-*
- config_name: Medical Sciences_Veterinary Sciences
data_files:
- split: test
path: content/Medical Sciences/Veterinary Sciences/test-*
- config_name: Medical Sciences_Virology
data_files:
- split: test
path: content/Medical Sciences/Virology/test-*
- config_name: Natural Sciences_Applied physics
data_files:
- split: test
path: content/Natural Sciences/Applied physics/test-*
- config_name: Natural Sciences_Astrophysics
data_files:
- split: test
path: content/Natural Sciences/Astrophysics/test-*
- config_name: Natural Sciences_Atomic, Molecular and Optical physics
data_files:
- split: test
path: content/Natural Sciences/Atomic, Molecular and Optical physics/test-*
- config_name: Natural Sciences_Biological Science
data_files:
- split: test
path: content/Natural Sciences/Biological Science/test-*
- config_name: Natural Sciences_Chemical Sciences
data_files:
- split: test
path: content/Natural Sciences/Chemical Sciences/test-*
- config_name: Natural Sciences_Condensed matter physics
data_files:
- split: test
path: content/Natural Sciences/Condensed matter physics/test-*
- config_name: Natural Sciences_Geography
data_files:
- split: test
path: content/Natural Sciences/Geography/test-*
- config_name: Natural Sciences_Mathematical Sciences
data_files:
- split: test
path: content/Natural Sciences/Mathematical Sciences/test-*
- config_name: Natural Sciences_Molecular Biology and Genetics
data_files:
- split: test
path: content/Natural Sciences/Molecular Biology and Genetics/test-*
- config_name: Natural Sciences_Nuclear and Particle Physics
data_files:
- split: test
path: content/Natural Sciences/Nuclear and Particle Physics/test-*
- config_name: Philosophy_Philosophy
data_files:
- split: test
path: content/Philosophy/Philosophy/test-*
- config_name: Social Sciences_Anthropology
data_files:
- split: test
path: content/Social Sciences/Anthropology/test-*
- config_name: Social Sciences_Archeology
data_files:
- split: test
path: content/Social Sciences/Archeology/test-*
- config_name: Social Sciences_Child Development
data_files:
- split: test
path: content/Social Sciences/Child Development/test-*
- config_name: Social Sciences_Demography
data_files:
- split: test
path: content/Social Sciences/Demography/test-*
- config_name: Social Sciences_Higher Education Studies
data_files:
- split: test
path: content/Social Sciences/Higher Education Studies/test-*
- config_name: Social Sciences_Housing
data_files:
- split: test
path: content/Social Sciences/Housing/test-*
- config_name: Social Sciences_International Relations
data_files:
- split: test
path: content/Social Sciences/International Relations/test-*
- config_name: Social Sciences_Library and Information Science
data_files:
- split: test
path: content/Social Sciences/Library and Information Science/test-*
- config_name: Social Sciences_Linguistics and Literature
data_files:
- split: test
path: content/Social Sciences/Linguistics and Literature/test-*
- config_name: Social Sciences_Open and Distance Education
data_files:
- split: test
path: content/Social Sciences/Open and Distance Education/test-*
- config_name: Social Sciences_Political Science
data_files:
- split: test
path: content/Social Sciences/Political Science/test-*
- config_name: Social Sciences_Psychology
data_files:
- split: test
path: content/Social Sciences/Psychology/test-*
- config_name: Social Sciences_Regional Studies
data_files:
- split: test
path: content/Social Sciences/Regional Studies/test-*
- config_name: Social Sciences_Social Policy
data_files:
- split: test
path: content/Social Sciences/Social Policy/test-*
- config_name: Social Sciences_Social Work
data_files:
- split: test
path: content/Social Sciences/Social Work/test-*
- config_name: Social Sciences_Sociology
data_files:
- split: test
path: content/Social Sciences/Sociology/test-*
- config_name: Social Sciences_Tourism and Hospitality
data_files:
- split: test
path: content/Social Sciences/Tourism and Hospitality/test-*
- config_name: Social Sciences_Transportation Science and Technology
data_files:
- split: test
path: content/Social Sciences/Transportation Science and Technology/test-*
- config_name: Theology_Theology
data_files:
- split: test
path: content/Theology/Theology/test-*
- config_name: testing
data_files:
- split: test
path: /content/testing/test-*
---
# Multi-domain academic audio data for evaluating ASR model
## Dataset Summary
This dataset, named "DomainSpeech," is meticulously curated to serve as a robust evaluation tool for Automatic Speech Recognition (ASR) models, encompassing a broad spectrum of academic domains including Agriculture, Sciences, Engineering, and Business. A distinctive feature of this dataset is its deliberately challenging design: the texts maintain a technical terminology density of 20%, which is higher than the norm in existing ASR evaluation datasets and makes "DomainSpeech" an ideal candidate for validating ASR performance on domain-specific content. The dataset's unique composition makes it a valuable asset for researchers and developers aiming to enhance the accuracy and reliability of ASR systems in academic and professional settings.
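The 20% terminology-density target can be made concrete with a small sketch. The tokenizer, the lexicon, and the example sentence below are illustrative assumptions for exposition, not part of the released pipeline:

```python
def term_density(text: str, term_lexicon: set) -> float:
    """Fraction of tokens that appear in a (single-token) domain lexicon."""
    tokens = [t.lower().strip(".,;:!?") for t in text.split()]
    tokens = [t for t in tokens if t]
    if not tokens:
        return 0.0
    return sum(t in term_lexicon for t in tokens) / len(tokens)

# Hypothetical lexicon and sentence -- 2 domain terms out of 10 tokens = 20%.
lexicon = {"myocardium", "ventricle"}
sentence = "The myocardium of the left ventricle thickens under sustained load."
print(term_density(sentence, lexicon))  # -> 0.2
```

A real pipeline would also need to handle multi-word terms, which this single-token sketch ignores.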
## Dataset Description
DomainSpeech is composed of 199 subsets, each contributing 300 rows of domain-specific English text data and corresponding 22050 Hz speech data. Each subset is named `{domain}_{subdomain}`. Although DomainSpeech focuses mainly on the evaluation of ASR models, it also includes an extra 1,500 rows for fine-tuning in some subdomains (Anatomy, Anthropology, Cardiology, Dentistry, Pathology).
## How to Use
To utilize the "DomainSpeech" dataset, especially focusing on a subset such as 'Medical Sciences_Anatomy,' you can follow the simple steps outlined below. This example demonstrates how to load the 'Medical Sciences_Anatomy' subset from the dataset for further analysis or model evaluation.
```python
from datasets import load_dataset
# Load the 'Medical Sciences_Anatomy' subset from the 'DomainSpeech' dataset
dataset = load_dataset("DoSp/DomainSpeech", "Medical Sciences_Anatomy")
```
## Evaluation Example
Evaluation results can be found in our paper, "DomainSpeech: Domain Specific Corpus to Evaluate and Enhance ASR System".
| | Anatomy | Anthropology | Cardiology | Dentistry | Pathology |
| ----------------- | ----- | ----- | ----- | ----- | ----- |
| **Whisper-small** | - | - | - | - | - |
| **Baseline** | 9.19 | 9.19 | 13.25 | 9.76 | 11.92 |
| **T5-base** |8.49 | 7.15 | 9.7 | 8.60 | 11.16 |
| **Whisper-large-v2** | - | - | - | - | - |
| **Baseline** | 3.98 | 3.19 | 6.17 | 4.33 | 6.85 |
| **T5-base** | 3.84 | 4.31 | 4.34 | 4.00 | 7.83 |
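Assuming the scores above are word error rates in percent (the standard ASR metric; the paper should be consulted for the exact definition), a minimal reference implementation of word-level WER can be sketched as:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance over reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[-1][-1] / max(len(ref), 1)

# One substitution in a five-word reference -> 20% WER.
print(round(100 * wer("the left atrium receives blood",
                      "the left atrium received blood"), 2))  # -> 20.0
```

In practice a library such as `jiwer` is commonly used instead of rolling your own.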
|
adalib/evaluate-data | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 4744278
num_examples: 353
- name: test
num_bytes: 970506
num_examples: 81
download_size: 2076716
dataset_size: 5714784
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
GoodBaiBai88/M3D-RefSeg | ---
license: apache-2.0
---
|
gokuls/wiki_book_corpus_processed_bert_dataset_medium | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 11100805200.0
num_examples: 3083557
download_size: 3272451599
dataset_size: 11100805200.0
---
# Dataset Card for "wiki_book_corpus_processed_bert_dataset_medium"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tippawan/test2-data-semi-trainulb-r5-WLV | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: prob
sequence: float64
- name: ifpass
sequence: int64
- name: pred
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 61178854
num_examples: 30461
download_size: 9518361
dataset_size: 61178854
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Salesforce/rose | ---
language:
- en
---
# ROSE 🌹
This repo contains the RoSE benchmark from our paper "Revisiting the Gold Standard:
Grounding Summarization Evaluation with Robust Human Evaluation".
Please visit [here](https://yale-lily.github.io/ROSE/) for a demo page of this project.
### ACU Annotations
The RoSE benchmark contains system outputs annotated with our ACU protocol.
It contains four parts:
- CNNDM, test set annotations
- CNNDM, validation set annotations
- XSum, test set annotations
- SamSum, test set annotations
We summarize the statistics below.
| Dataset | Split | #Doc. | #Sys. | #Total Summ. | HF Name |
| --- | --- | --- | --- | --- | --- |
| CNNDM | Test | 500 | 12 | 6000 | `cnndm_test` |
| CNNDM | Validation | 1000 | 8 | 8000 | `cnndm_validation` |
| XSum | Test | 500 | 8 | 4000 | `xsum` |
| SamSum | Test | 500 | 8 | 4000 | `samsum` |
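At its core, an ACU-based score aggregates binary presence judgments over the reference's atomic content units. As a rough sketch only (the paper's exact scoring, including any normalization, may differ, and the field layout below is hypothetical):

```python
from statistics import mean

def acu_score(matched_units: list) -> float:
    """Unnormalized ACU score: fraction of reference ACUs marked present."""
    return sum(matched_units) / len(matched_units) if matched_units else 0.0

# Hypothetical annotations: each inner list is one summary's per-ACU judgments.
system_annotations = [
    [True, True, False, True],    # 3/4 ACUs covered
    [True, False, False, False],  # 1/4 ACUs covered
]
system_score = mean(acu_score(m) for m in system_annotations)
print(system_score)  # -> 0.5
```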
### Human Annotations with Different Evaluation Protocols
We have system outputs annotated with four different human evaluation protocols in total.
We summarize them below.
| Protocol | w/ Input Document | w/ Reference Summary | Fine-grained |
| --- | --- | --- | --- |
| Prior | ✗ | ✗ | ✗ |
| Ref-free | ✓ | ✗ | ✗ |
| Ref-based | ✗ | ✓ | ✗ |
| ACU | ✗ | ✓ | ✓ |
We annotated two sets of system summaries.
1. Summaries of 12 fine-tuned systems. The huggingface data split name is `cnndm_protocol`.
2. Zero-shot summaries from large language models (GPT-3, T0), together with summaries from BRIO and BART. The huggingface data split name is `cnndm_protocol_gpt3`.
|
C0uchP0tat0/atomic_data | ---
dataset_info:
features:
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 210870
num_examples: 50
download_size: 260325
dataset_size: 210870
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community | ---
pretty_name: Evaluation run of GOAT-AI/GOAT-7B-Community
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [GOAT-AI/GOAT-7B-Community](https://huggingface.co/GOAT-AI/GOAT-7B-Community)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T17:14:52.967997](https://huggingface.co/datasets/open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community/blob/main/results_2023-09-22T17-14-52.967997.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005243288590604027,\n\
\ \"em_stderr\": 0.0007396052260778031,\n \"f1\": 0.06909395973154382,\n\
\ \"f1_stderr\": 0.0015832414439852427,\n \"acc\": 0.3838492484021702,\n\
\ \"acc_stderr\": 0.009135888573374731\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.005243288590604027,\n \"em_stderr\": 0.0007396052260778031,\n\
\ \"f1\": 0.06909395973154382,\n \"f1_stderr\": 0.0015832414439852427\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04473085670962851,\n \
\ \"acc_stderr\": 0.005693886131407048\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7229676400947119,\n \"acc_stderr\": 0.012577891015342414\n\
\ }\n}\n```"
repo_url: https://huggingface.co/GOAT-AI/GOAT-7B-Community
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|arc:challenge|25_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T17_14_52.967997
path:
- '**/details_harness|drop|3_2023-09-22T17-14-52.967997.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T17-14-52.967997.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T17_14_52.967997
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-14-52.967997.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-14-52.967997.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hellaswag|10_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T12:51:32.230763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T12:51:32.230763.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T12:51:32.230763.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T17_14_52.967997
path:
- '**/details_harness|winogrande|5_2023-09-22T17-14-52.967997.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T17-14-52.967997.parquet'
- config_name: results
data_files:
- split: 2023_07_25T12_51_32.230763
path:
- results_2023-07-25T12:51:32.230763.parquet
- split: 2023_09_22T17_14_52.967997
path:
- results_2023-09-22T17-14-52.967997.parquet
- split: latest
path:
- results_2023-09-22T17-14-52.967997.parquet
---
# Dataset Card for Evaluation run of GOAT-AI/GOAT-7B-Community
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/GOAT-AI/GOAT-7B-Community
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [GOAT-AI/GOAT-7B-Community](https://huggingface.co/GOAT-AI/GOAT-7B-Community) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community",
"harness_winogrande_5",
            split="latest")
```
## Latest results
These are the [latest results from run 2023-09-22T17:14:52.967997](https://huggingface.co/datasets/open-llm-leaderboard/details_GOAT-AI__GOAT-7B-Community/blob/main/results_2023-09-22T17-14-52.967997.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.005243288590604027,
"em_stderr": 0.0007396052260778031,
"f1": 0.06909395973154382,
"f1_stderr": 0.0015832414439852427,
"acc": 0.3838492484021702,
"acc_stderr": 0.009135888573374731
},
"harness|drop|3": {
"em": 0.005243288590604027,
"em_stderr": 0.0007396052260778031,
"f1": 0.06909395973154382,
"f1_stderr": 0.0015832414439852427
},
"harness|gsm8k|5": {
"acc": 0.04473085670962851,
"acc_stderr": 0.005693886131407048
},
"harness|winogrande|5": {
"acc": 0.7229676400947119,
"acc_stderr": 0.012577891015342414
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Jiahuan/dst_en | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3879259
num_examples: 2535
- name: val
num_bytes: 1264245
num_examples: 830
- name: test
num_bytes: 2514258
num_examples: 1646
download_size: 286051
dataset_size: 7657762
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
JailbreakV-28K/JailBreakV-28k | ---
license: mit
task_categories:
- text-generation
- question-answering
pretty_name: JailBreakV_28K
size_categories:
- 10K<n<100K
configs:
- config_name: JailBreakV_28K
data_files:
- split: mini_JailBreakV_28K
path: JailBreakV_28K/mini_JailBreakV_28K.csv
- split: JailBreakV_28K
path: JailBreakV_28K/JailBreakV_28K.csv
- config_name: RedTeam_2K
data_files:
- split: RedTeam_2K
path: JailBreakV_28K/RedTeam_2K.csv
---
# ⛓💥 JailBreakV-28K: A Benchmark for Assessing the Robustness of MultiModal Large Language Models against Jailbreak Attacks
[**🌐 GitHub**](https://github.com/EddyLuo1232/JailBreakV_28K) | [**🛎 Project Page**](https://eddyluo1232.github.io/JailBreakV28K/) | [**👉 Download full datasets**](https://drive.google.com/file/d/1ZrvSHklXiGYhpiVoxUH8FWc5k0fv2xVZ/view?usp=drive_link)
## If you like our project, please give us a star ⭐ on Hugging Face for the latest updates.
## 📥 Using our dataset via huggingface Dataset
```python
from datasets import load_dataset
mini_JailBreakV_28K = load_dataset("JailbreakV-28K/JailBreakV-28k", 'JailBreakV_28K')["mini_JailBreakV_28K"]
JailBreakV_28K = load_dataset("JailbreakV-28K/JailBreakV-28k", 'JailBreakV_28K')["JailBreakV_28K"]
RedTeam_2K = load_dataset("JailbreakV-28K/JailBreakV-28k", 'RedTeam_2K')["RedTeam_2K"]
```
## 👻 An Example of Inference and Evaluation on Bunny-v1_0
```shell
pip install -r requirement.txt
python attack.py --root JailBreakV_28K # use Bunny-v1_0 as case in this script
python evaluate_llamaguard_ours.py --data_path ./results/Bunny-v1_0/JailBreakV_28K.csv
```
## 😃 Dataset Details
The JailBreakV_28K and mini_JailBreakV_28K datasets comprise the following columns:
- **id**: Unique identifier for all samples.
- **jailbreak_query**: Jailbreak_query obtained by different jailbreak attacks.
- **redteam_query**: Harmful query from RedTeam_2K.
- **format**: Jailbreak attack method including template, persuade, logic, figstep, query-relevant.
- **policy**: The safety policy that the redteam_query goes against.
- **image_path**: The file path of the image.
- **from**: The source of data.
- **selected_mini**: "True" if the data in mini_JailBreakV_28K dataset, otherwise "False".
- **transfer_from_llm**: "True" if the jailbreak_query is transferred from LLM jailbreak attacks, otherwise "False".
RedTeam_2K will comprise the following columns:
- **id**: Unique identifier for all samples.
- **question**: Harmful query.
- **policy**: The safety policy that the redteam_query goes against.
- **from**: The source of data.
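As a quick sketch of how the `selected_mini` flag can be used to recover the mini subset from the full split (the rows below are hypothetical stand-ins mirroring the columns above, not real data):

```python
# Hypothetical rows mirroring the columns described above; the real data is
# loaded via load_dataset("JailbreakV-28K/JailBreakV-28k", "JailBreakV_28K").
rows = [
    {"id": 0, "format": "template", "selected_mini": "True"},
    {"id": 1, "format": "figstep", "selected_mini": "False"},
    {"id": 2, "format": "logic", "selected_mini": "True"},
]

# Keep only the rows flagged as part of mini_JailBreakV_28K.
mini = [r for r in rows if r["selected_mini"] == "True"]
print([r["id"] for r in mini])  # [0, 2]
```

With the real dataset loaded via `load_dataset`, the same predicate can be passed to `Dataset.filter`.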
## 🚀 Data Composition
<!-- Provide a quick summary of the dataset. -->
**RedTeam-2K**: The RedTeam-2K dataset is a meticulously curated collection of 2,000 harmful queries aimed at identifying alignment vulnerabilities within LLMs and MLLMs. It spans 16 safety policies and incorporates queries from 8 distinct sources.
<img src="distribution_used.png" width="800"/>
**JailBreakV_28K**: JailBreakV-28K contains 28,000 jailbreak text-image pairs, which include 20,000 text-based LLM transfer jailbreak attacks and 8,000 image-based MLLM jailbreak attacks. This dataset covers 16 safety policies and 5 diverse jailbreak methods.
## 🛠️ Dataset Overview
The RedTeam-2K dataset is a meticulously curated collection of 2,000 harmful queries aimed at identifying alignment vulnerabilities within LLMs and MLLMs. This dataset spans 16 safety policies and incorporates queries from 8 distinct sources, including GPT Rewrite, Handcraft, GPT Generate, LLM Jailbreak Study, AdvBench, BeaverTails, Question Set, and Anthropic's hh-rlhf.
Building upon the harmful query dataset provided by RedTeam-2K, JailBreakV-28K is designed as a comprehensive and diversified benchmark for evaluating the transferability of jailbreak attacks from LLMs to MLLMs, as well as assessing the alignment robustness of MLLMs against such attacks. Specifically, JailBreakV-28K contains 28,000 jailbreak text-image pairs, which include 20,000 text-based LLM transfer jailbreak attacks and 8,000 image-based MLLM jailbreak attacks. This dataset covers 16 safety policies and 5 diverse jailbreak methods. The jailbreak methods comprise 3 types of LLM transfer attacks, namely Logic (Cognitive Overload), Persuade (Persuasive Adversarial Prompts), and Template (including both Greedy Coordinate Gradient and handcrafted strategies), and 2 types of MLLM attacks, FigStep and the Query-relevant attack. JailBreakV-28K offers a broad spectrum of attack methodologies and integrates various image types such as Nature, Random Noise, Typography, Stable Diffusion (SD), Blank, and SD+Typography images. We believe JailBreakV-28K can serve as a comprehensive jailbreak benchmark for MLLMs.
<img src="dataset.png" width="800"/>
## 🏆 Mini-Leaderboard
| Model | Total ASR| Transfer Attack ASR|
|----------------------------|:---------:|:------------------:|
| OmniLMM-12B | 58.1 | 70.2 |
| InfiMM-Zephyr-7B | 52.9 | 73.0 |
| LLaMA-Adapter-v2 | 51.2 | 68.1 |
| LLaVA-1.5-13B | 51.0 | 65.5 |
| LLaVA-1.5-7B | 46.8 | 61.4 |
| InstructBLIP-13B | 45.2 | 55.5 |
| InternLM-XComposer2-VL-7B | 39.1 | 29.3 |
| Bunny-v1 | 38.0 | 49.5 |
| Qwen-VL-Chat | 33.7 | 41.2 |
| InstructBLIP-7B | 26.0 | 46.8 |
## ❌ Disclaimers
This dataset contains offensive content that may be disturbing. This benchmark is provided for educational and research purposes only.
## 📲 Contact
- Weidi Luo: luo.1455@osu.edu
- Siyuan Ma: siyuan.ma.pku@gmail.com
- Xiaogeng Liu: xiaogeng.liu@wisc.edu
- Chaowei Xiao: cxiao34@wisc.edu
## 📖 BibTeX:
```bibtex
@misc{luo2024jailbreakv28k,
title={JailBreakV-28K: A Benchmark for Assessing the Robustness of MultiModal Large Language Models against Jailbreak Attacks},
author={Weidi Luo and Siyuan Ma and Xiaogeng Liu and Xiaoyu Guo and Chaowei Xiao},
year={2024},
eprint={2404.03027},
archivePrefix={arXiv},
primaryClass={cs.CR}
}
```
[More Information Needed] |
BuroIdentidadDigital/forma_Migratoria | ---
license: c-uda
---
|
Amanaccessassist/sample | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 48442545.64163478
num_examples: 233294
- name: test
num_bytes: 5382597.358365225
num_examples: 25922
download_size: 20470912
dataset_size: 53825143.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-12000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 675507
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/mmarco_v2_it_dev | ---
pretty_name: '`mmarco/v2/it/dev`'
viewer: false
source_datasets: ['irds/mmarco_v2_it']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/it/dev`
The `mmarco/v2/it/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/it/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=101,093
- `qrels` (relevance assessments); count=59,273
- For `docs`, use [`irds/mmarco_v2_it`](https://huggingface.co/datasets/irds/mmarco_v2_it)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_v2_it_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_v2_it_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
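For evaluation, qrels records in the shape yielded above are typically grouped into a per-query lookup of judged documents. A minimal sketch (the record values here are hypothetical; the real records come from `load_dataset('irds/mmarco_v2_it_dev', 'qrels')`):

```python
# Hypothetical qrels records in the same shape as the dataset yields them.
qrels = [
    {"query_id": "q1", "doc_id": "d3", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d7", "relevance": 1, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d1", "relevance": 1, "iteration": "0"},
]

# Group judged doc_ids by query_id for constant-time lookup during scoring.
by_query = {}
for rec in qrels:
    by_query.setdefault(rec["query_id"], set()).add(rec["doc_id"])
print(sorted(by_query["q1"]))  # ['d3', 'd7']
```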
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
JET2001/hf-imda-dataset-with-embeddings-150-speakers | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
- name: speaker_embeddings
sequence: float32
splits:
- name: train
num_bytes: 9534589364
num_examples: 88090
download_size: 9529195163
dataset_size: 9534589364
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MartinDx/burgers | ---
license: mit
---
|
vigneshgs7/Boundary_detection_Doc_11 | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: image
dtype: image
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap
dtype: image
splits:
- name: train
num_bytes: 24078208445.0
num_examples: 484
download_size: 1587392343
dataset_size: 24078208445.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-astronomy-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 3383
num_examples: 5
download_size: 0
dataset_size: 3383
---
# Dataset Card for "mmlu-astronomy-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JeremiahZ/hendrycks_math_merged | ---
dataset_info:
features:
- name: problem
dtype: string
- name: level
dtype: string
- name: type
dtype: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 5984772
num_examples: 7500
- name: test
num_bytes: 3732833
num_examples: 5000
download_size: 4848009
dataset_size: 9717605
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# MATH Dataset Merged
This dataset is created from [EleutherAI/hendrycks_math](https://huggingface.co/datasets/EleutherAI/hendrycks_math) using the following script:
```python
import datasets
from datasets import DatasetDict
# dataset = datasets.load_dataset("EleutherAI/hendrycks_math")
configs = ['algebra', 'counting_and_probability', 'geometry', 'intermediate_algebra', 'number_theory', 'prealgebra', 'precalculus']
all_train_datasets = []
all_test_datasets = []
for c in configs:
c_dataset = datasets.load_dataset("EleutherAI/hendrycks_math", c)
all_train_datasets.append(c_dataset['train'])
all_test_datasets.append(c_dataset['test'])
# merge all datasets
merged_train_dataset = datasets.concatenate_datasets(all_train_datasets)
merged_test_dataset = datasets.concatenate_datasets(all_test_datasets)
merged_dataset = DatasetDict({'train': merged_train_dataset, 'test': merged_test_dataset})
merged_dataset
``` |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/efac3312 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 176
num_examples: 10
download_size: 1322
dataset_size: 176
---
# Dataset Card for "efac3312"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chaeso/food_chinese_2017 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '100'
'1': '101'
'2': '102'
'3': '103'
'4': '104'
'5': '105'
'6': '106'
'7': '107'
'8': '108'
'9': '109'
'10': '110'
'11': '111'
'12': '112'
'13': '113'
'14': '114'
'15': '115'
'16': '116'
'17': '117'
'18': '118'
'19': '119'
'20': '12'
'21': '120'
'22': '121'
'23': '122'
'24': '123'
'25': '124'
'26': '125'
'27': '126'
'28': '127'
'29': '128'
'30': '129'
'31': '13'
'32': '130'
'33': '131'
'34': '132'
'35': '133'
'36': '134'
'37': '135'
'38': '136'
'39': '137'
'40': '138'
'41': '139'
'42': '14'
'43': '140'
'44': '141'
'45': '142'
'46': '143'
'47': '144'
'48': '145'
'49': '146'
'50': '147'
'51': '148'
'52': '149'
'53': '15'
'54': '150'
'55': '151'
'56': '152'
'57': '153'
'58': '154'
'59': '155'
'60': '156'
'61': '157'
'62': '158'
'63': '159'
'64': '16'
'65': '160'
'66': '161'
'67': '162'
'68': '163'
'69': '164'
'70': '165'
'71': '166'
'72': '167'
'73': '168'
'74': '169'
'75': '17'
'76': '170'
'77': '171'
'78': '172'
'79': '173'
'80': '174'
'81': '175'
'82': '176'
'83': '177'
'84': '178'
'85': '179'
'86': '18'
'87': '180'
'88': '181'
'89': '182'
'90': '183'
'91': '184'
'92': '185'
'93': '186'
'94': '187'
'95': '188'
'96': '189'
'97': '19'
'98': '190'
'99': '191'
'100': '192'
'101': '193'
'102': '194'
'103': '195'
'104': '196'
'105': '197'
'106': '198'
'107': '199'
'108': '20'
'109': '200'
'110': '201'
'111': '202'
'112': '203'
'113': '204'
'114': '205'
'115': '206'
'116': '207'
'117': '208'
'118': '209'
'119': '21'
'120': '210'
'121': '211'
'122': '212'
'123': '213'
'124': '214'
'125': '215'
'126': '216'
'127': '217'
'128': '218'
'129': '219'
'130': '22'
'131': '220'
'132': '221'
'133': '222'
'134': '223'
'135': '224'
'136': '225'
'137': '226'
'138': '227'
'139': '228'
'140': '229'
'141': '23'
'142': '230'
'143': '231'
'144': '232'
'145': '233'
'146': '234'
'147': '235'
'148': '236'
'149': '237'
'150': '238'
'151': '239'
'152': '24'
'153': '240'
'154': '241'
'155': '242'
'156': '243'
'157': '244'
'158': '245'
'159': '246'
'160': '247'
'161': '248'
'162': '249'
'163': '25'
'164': '250'
'165': '251'
'166': '252'
'167': '253'
'168': '254'
'169': '255'
'170': '256'
'171': '26'
'172': '27'
'173': '28'
'174': '29'
'175': '3'
'176': '30'
'177': '31'
'178': '32'
'179': '33'
'180': '34'
'181': '35'
'182': '36'
'183': '37'
'184': '38'
'185': '39'
'186': '4'
'187': '40'
'188': '41'
'189': '42'
'190': '43'
'191': '44'
'192': '45'
'193': '46'
'194': '47'
'195': '48'
'196': '49'
'197': '50'
'198': '51'
'199': '52'
'200': '53'
'201': '54'
'202': '55'
'203': '56'
'204': '57'
'205': '58'
'206': '59'
'207': '60'
'208': '61'
'209': '62'
'210': '63'
'211': '64'
'212': '65'
'213': '66'
'214': '67'
'215': '68'
'216': '69'
'217': '70'
'218': '71'
'219': '72'
'220': '73'
'221': '74'
'222': '75'
'223': '76'
'224': '77'
'225': '78'
'226': '79'
'227': '8'
'228': '80'
'229': '81'
'230': '82'
'231': '83'
'232': '84'
'233': '85'
'234': '86'
'235': '87'
'236': '88'
'237': '89'
'238': '9'
'239': '90'
'240': '91'
'241': '92'
'242': '93'
'243': '94'
'244': '95'
'245': '96'
'246': '97'
'247': '98'
'248': '99'
'249': beef_currie
'250': bibimbob
'251': donburi
'252': grilled_eel
'253': rice
'254': sushi
'255': tendong
splits:
- name: train
num_bytes: 408076826.985
num_examples: 31395
- name: test
num_bytes: 135802193.08
num_examples: 6660
- name: validation
num_bytes: 137529971.372
num_examples: 6734
download_size: 677961805
dataset_size: 681408991.437
---
# Dataset Card for "food_chinese_2017"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
timm/resisc45 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': airplane
'1': airport
'2': baseball_diamond
'3': basketball_court
'4': beach
'5': bridge
'6': chaparral
'7': church
'8': circular_farmland
'9': cloud
'10': commercial_area
'11': dense_residential
'12': desert
'13': forest
'14': freeway
'15': golf_course
'16': ground_track_field
'17': harbor
'18': industrial_area
'19': intersection
'20': island
'21': lake
'22': meadow
'23': medium_residential
'24': mobile_home_park
'25': mountain
'26': overpass
'27': palace
'28': parking_lot
'29': railway
'30': railway_station
'31': rectangular_farmland
'32': river
'33': roundabout
'34': runway
'35': sea_ice
'36': ship
'37': snowberg
'38': sparse_residential
'39': stadium
'40': storage_tank
'41': tennis_court
'42': terrace
'43': thermal_power_station
'44': wetland
- name: image_id
dtype: string
splits:
- name: train
num_bytes: 254594749.8
num_examples: 18900
- name: validation
num_bytes: 84784207.3
num_examples: 6300
- name: test
num_bytes: 85237234
num_examples: 6300
download_size: 425667137
dataset_size: 424616191.1
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
size_categories:
- 10K<n<100K
license: unknown
task_categories:
- image-classification
---
## Description
The RESISC45 dataset is a publicly available benchmark for Remote Sensing Image Scene Classification (RESISC), created by Northwestern Polytechnical University (NWPU). It contains 31,500 images covering 45 scene classes, with 700 images per class.
The dataset does not define any official splits. The train, validation, and test splits here follow the definitions in https://github.com/google-research/google-research/blob/master/remote_sensing_representations/README.md#dataset-splits
- Paper: https://arxiv.org/abs/1703.00121
- Website: https://paperswithcode.com/dataset/resisc45 (the original homepage, http://www.escience.cn/people/JunweiHan/NWPU-RESISC45.html, is unresponsive)
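As a minimal sketch of using the splits above, the sizes declared in this card's metadata can be sanity-checked after loading (the helper below is illustrative, not part of any official tooling):

```python
# Split sizes as declared in this card's metadata (train/validation/test).
EXPECTED_SIZES = {"train": 18900, "validation": 6300, "test": 6300}

def check_sizes(split_lengths: dict) -> bool:
    """Return True when every expected split is present with the declared size."""
    return all(split_lengths.get(name) == n for name, n in EXPECTED_SIZES.items())

# The three splits together cover all 45 classes x 700 images.
assert sum(EXPECTED_SIZES.values()) == 45 * 700
```

After `ds = load_dataset("timm/resisc45")` (network-dependent, ~425 MB on first download), `check_sizes({name: len(split) for name, split in ds.items()})` should return `True`.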
## Citation
```bibtex
@article{Cheng_2017,
title={Remote Sensing Image Scene Classification: Benchmark and State of the Art},
volume={105},
ISSN={1558-2256},
url={http://dx.doi.org/10.1109/JPROC.2017.2675998},
DOI={10.1109/jproc.2017.2675998},
number={10},
journal={Proceedings of the IEEE},
publisher={Institute of Electrical and Electronics Engineers (IEEE)},
author={Cheng, Gong and Han, Junwei and Lu, Xiaoqiang},
year={2017},
month={Oct},
pages={1865-1883}
}
``` |
McGill-NLP/WebLINX-full | ---
language:
- en
size_categories:
- 10K<n<100K
config_names:
- chat
configs:
- config_name: chat
default: true
data_files:
- split: train
path: chat/train.csv
- split: validation
path: chat/valid.csv
- split: test
path: chat/test_iid.csv
- split: test_geo
path: chat/test_geo.csv
- split: test_vis
path: chat/test_vis.csv
- split: test_cat
path: chat/test_cat.csv
- split: test_web
path: chat/test_web.csv
tags:
- conversational
- image-to-text
- vision
- convAI
---
# WebLINX: Real-World Website Navigation with Multi-Turn Dialogue
WARNING: This is not the main WebLINX dataset card! You most likely want the main card instead:
> **[WebLINX: Real-World Website Navigation with Multi-Turn Dialogue](https://huggingface.co/datasets/mcgill-nlp/weblinx)** |
text2log | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
pretty_name: text2log
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- translation
task_ids: []
dataset_info:
features:
- name: sentence
dtype: string
- name: fol_translation
dtype: string
splits:
- name: train
num_bytes: 10358134
num_examples: 101931
download_size: 9746473
dataset_size: 10358134
---
# Dataset Card for text2log
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:**
- **Repository:** [GitHub](https://github.com/alevkov/text2log)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** https://github.com/alevkov
### Dataset Summary
The dataset contains about 100,000 simple English sentences selected and filtered from `enTenTen15`, together with their translations into First Order Logic (FOL) produced with `ccg2lambda`.
### Supported Tasks and Leaderboards
'semantic-parsing': the dataset is used to train models that generate FOL statements from natural language text.
### Languages
en-US
## Dataset Structure
### Data Instances
```
{
'clean':'All things that are new are good.',
'trans':'all x1.(_thing(x1) -> (_new(x1) -> _good(x1)))'
}
```
### Data Fields
- 'clean': a simple English sentence
- 'trans': the corresponding translation into Lambda Dependency-based Compositional Semantics
### Data Splits
No predefined train/test split is given. The authors used an 80/20 split.
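The authors' exact partition is not published; as a hedged sketch, the sizes of an 80/20 split over the 101,931 examples can be computed with a small helper (rounding the test portion up, as is conventional):

```python
import math

TEST_FRACTION = 0.2  # the authors' reported 80/20 split

def split_sizes(n_examples: int) -> tuple:
    """Return (n_train, n_test) for an 80/20 split, rounding the test set up."""
    n_test = math.ceil(n_examples * TEST_FRACTION)
    return n_examples - n_test, n_test

# For the 101,931 examples in this dataset:
print(split_sizes(101931))  # -> (81544, 20387)
```

With the hub dataset this corresponds to something like `load_dataset("text2log", split="train").train_test_split(test_size=0.2, seed=42)`; the seed here is an arbitrary assumption, not the authors' choice.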
## Dataset Creation
### Curation Rationale
The text2log dataset is used to improve FOL statement generation from natural language text.
### Source Data
#### Initial Data Collection and Normalization
Short text samples selected from enTenTen15
#### Who are the source language producers?
See https://www.sketchengine.eu/ententen-english-corpus/
### Annotations
#### Annotation process
Machine generated using https://github.com/mynlp/ccg2lambda
#### Who are the annotators?
none
### Personal and Sensitive Information
The dataset does not contain personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
None given
### Citation Information
```bibtex
@INPROCEEDINGS{9401852,
author={Levkovskyi, Oleksii and Li, Wei},
booktitle={SoutheastCon 2021},
title={Generating Predicate Logic Expressions from Natural Language},
year={2021},
volume={},
number={},
pages={1-8},
doi={10.1109/SoutheastCon45413.2021.9401852}
}
```
### Contributions
Thanks to [@apergo-ai](https://github.com/apergo-ai) for adding this dataset. |
DataStudio/TTS_02 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: content
dtype: string
splits:
- name: train
num_bytes: 13854838890.994
num_examples: 75523
download_size: 10957004730
dataset_size: 13854838890.994
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "TTS_02"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_40 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1281646216
num_examples: 251698
download_size: 1299505185
dataset_size: 1281646216
---
# Dataset Card for "chunk_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/eval_tag_squad_v8 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 13020105
num_examples: 10570
- name: validation
num_bytes: 13020105
num_examples: 10570
download_size: 5664930
dataset_size: 26040210
---
# Dataset Card for "eval_tag_squad_v8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B | ---
pretty_name: Evaluation run of KnutJaegersberg/Deacon-20B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/Deacon-20B](https://huggingface.co/KnutJaegersberg/Deacon-20B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T09:05:17.184238](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B/blob/main/results_2024-01-05T09-05-17.184238.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6041366068244398,\n\
\ \"acc_stderr\": 0.032898915535709075,\n \"acc_norm\": 0.6106134294937929,\n\
\ \"acc_norm_stderr\": 0.033580635198863264,\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.017201949234553104,\n \"mc2\": 0.5848788971105185,\n\
\ \"mc2_stderr\": 0.01542200303332033\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.014484703048857362,\n\
\ \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6254730133439554,\n\
\ \"acc_stderr\": 0.004830113797327048,\n \"acc_norm\": 0.8173670583549094,\n\
\ \"acc_norm_stderr\": 0.0038557568514415433\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365252,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365252\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520203,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520203\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.024993053397764812,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.024993053397764812\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198913,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198913\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n\
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945273,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945273\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399306,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457964,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457964\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n\
\ \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n\
\ \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2245810055865922,\n\
\ \"acc_stderr\": 0.013956803666544641,\n \"acc_norm\": 0.2245810055865922,\n\
\ \"acc_norm_stderr\": 0.013956803666544641\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291477,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291477\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922442,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922442\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411127,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411127\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.019373332420724504,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.019373332420724504\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128438,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128438\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328903,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328903\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.0337997668989631,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.0337997668989631\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686399,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686399\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.017201949234553104,\n \"mc2\": 0.5848788971105185,\n\
\ \"mc2_stderr\": 0.01542200303332033\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827933\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2918877937831691,\n \
\ \"acc_stderr\": 0.012522795894420869\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/Deacon-20B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|arc:challenge|25_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|gsm8k|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hellaswag|10_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T09-05-17.184238.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T09-05-17.184238.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- '**/details_harness|winogrande|5_2024-01-05T09-05-17.184238.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T09-05-17.184238.parquet'
- config_name: results
data_files:
- split: 2024_01_05T09_05_17.184238
path:
- results_2024-01-05T09-05-17.184238.parquet
- split: latest
path:
- results_2024-01-05T09-05-17.184238.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-20B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deacon-20B](https://huggingface.co/KnutJaegersberg/Deacon-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, one for each evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B",
"harness_winogrande_5",
	split="latest")
```
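Both the configuration names above and the keys in the results JSON follow a `harness|<task>|<n_shot>` naming pattern. A small illustrative helper (not part of the harness API) to split such a key:

```python
def parse_task_key(key: str) -> tuple[str, int]:
    """Split a 'harness|<task>|<n_shot>' key into a task name and shot count."""
    prefix, task, n_shot = key.split("|")
    assert prefix == "harness"
    return task, int(n_shot)

task, shots = parse_task_key("harness|hendrycksTest-anatomy|5")
# → ("hendrycksTest-anatomy", 5)
```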
## Latest results
These are the [latest results from run 2024-01-05T09:05:17.184238](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B/blob/main/results_2024-01-05T09-05-17.184238.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6041366068244398,
"acc_stderr": 0.032898915535709075,
"acc_norm": 0.6106134294937929,
"acc_norm_stderr": 0.033580635198863264,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553104,
"mc2": 0.5848788971105185,
"mc2_stderr": 0.01542200303332033
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.014484703048857362,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670728
},
"harness|hellaswag|10": {
"acc": 0.6254730133439554,
"acc_stderr": 0.004830113797327048,
"acc_norm": 0.8173670583549094,
"acc_norm_stderr": 0.0038557568514415433
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365252,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365252
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520203,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764812,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764812
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198913,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198913
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945273,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945273
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399306,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457964,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457964
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2245810055865922,
"acc_stderr": 0.013956803666544641,
"acc_norm": 0.2245810055865922,
"acc_norm_stderr": 0.013956803666544641
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291477,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922442,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922442
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411127,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411127
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.019373332420724504,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.019373332420724504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128438,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128438
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328903,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328903
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.0337997668989631,
"acc_norm": 0.87,
"acc_norm_stderr": 0.0337997668989631
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553104,
"mc2": 0.5848788971105185,
"mc2_stderr": 0.01542200303332033
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827933
},
"harness|gsm8k|5": {
"acc": 0.2918877937831691,
"acc_stderr": 0.012522795894420869
}
}
```
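The `"all"` entry at the top of results like these is an average over the per-task scores. As a minimal, illustrative sketch (not the leaderboard's own aggregation code), a dict of the shape shown above can be reduced to a macro-average over the MMLU (`hendrycksTest`) subtasks like so; the two sample values are copied from the results in this card:

```python
# Illustrative sketch: macro-average the MMLU subtask accuracies out of a
# results dict shaped like the JSON above. Sample entries copied from this card.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5180722891566265},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7777777777777778},
    "harness|winogrande|5": {"acc": 0.7679558011049724},  # not an MMLU subtask
}

# Keep only the MMLU subtasks, identified by the "hendrycksTest-" prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]

# Unweighted (macro) mean over the selected subtasks.
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_macro_avg, 4))
```

The full card carries 57 `hendrycksTest` subtasks; the same filter-then-average pattern applies unchanged.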
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_DreadPoor__Chimera-7B-slerp | ---
pretty_name: Evaluation run of DreadPoor/Chimera-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/Chimera-7B-slerp](https://huggingface.co/DreadPoor/Chimera-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__Chimera-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T22:26:43.244149](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__Chimera-7B-slerp/blob/main/results_2024-04-08T22-26-43.244149.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530256896006031,\n\
\ \"acc_stderr\": 0.032000934663522176,\n \"acc_norm\": 0.6530569014798114,\n\
\ \"acc_norm_stderr\": 0.03265902654314118,\n \"mc1\": 0.47980416156670747,\n\
\ \"mc1_stderr\": 0.01748921684973705,\n \"mc2\": 0.6443096859937673,\n\
\ \"mc2_stderr\": 0.015348310496933864\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6723549488054608,\n \"acc_stderr\": 0.01371584794071934,\n\
\ \"acc_norm\": 0.7022184300341296,\n \"acc_norm_stderr\": 0.01336308010724448\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7003584943238399,\n\
\ \"acc_stderr\": 0.004571647137441118,\n \"acc_norm\": 0.8709420434176459,\n\
\ \"acc_norm_stderr\": 0.0033457889052629477\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908352,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908352\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342853,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342853\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.01322392861674162,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.01322392861674162\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.016598022120580435,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.016598022120580435\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532067,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.01899970738316267,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.01899970738316267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070813,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070813\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47980416156670747,\n\
\ \"mc1_stderr\": 0.01748921684973705,\n \"mc2\": 0.6443096859937673,\n\
\ \"mc2_stderr\": 0.015348310496933864\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.010796468688068677\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6846095526914329,\n \
\ \"acc_stderr\": 0.012799353675801832\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/Chimera-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-26-43.244149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-26-43.244149.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- '**/details_harness|winogrande|5_2024-04-08T22-26-43.244149.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T22-26-43.244149.parquet'
- config_name: results
data_files:
- split: 2024_04_08T22_26_43.244149
path:
- results_2024-04-08T22-26-43.244149.parquet
- split: latest
path:
- results_2024-04-08T22-26-43.244149.parquet
---
# Dataset Card for Evaluation run of DreadPoor/Chimera-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/Chimera-7B-slerp](https://huggingface.co/DreadPoor/Chimera-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
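The config names listed in the metadata follow a mechanical naming pattern. A minimal sketch of the mapping (inferred from the config list above, not an official API):

```python
# Inferred pattern (an assumption based on the config list above, not an
# official API): a task name such as "harness|hendrycksTest-anatomy|5" maps
# to the config name "harness_hendrycksTest_anatomy_5" by replacing the
# characters "|", "-" and ":" with "_".
def task_to_config(task: str) -> str:
    for ch in "|-:":
        task = task.replace(ch, "_")
    return task

print(task_to_config("harness|hendrycksTest-anatomy|5"))  # → harness_hendrycksTest_anatomy_5
```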
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__Chimera-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-08T22:26:43.244149](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__Chimera-7B-slerp/blob/main/results_2024-04-08T22-26-43.244149.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6530256896006031,
"acc_stderr": 0.032000934663522176,
"acc_norm": 0.6530569014798114,
"acc_norm_stderr": 0.03265902654314118,
"mc1": 0.47980416156670747,
"mc1_stderr": 0.01748921684973705,
"mc2": 0.6443096859937673,
"mc2_stderr": 0.015348310496933864
},
"harness|arc:challenge|25": {
"acc": 0.6723549488054608,
"acc_stderr": 0.01371584794071934,
"acc_norm": 0.7022184300341296,
"acc_norm_stderr": 0.01336308010724448
},
"harness|hellaswag|10": {
"acc": 0.7003584943238399,
"acc_stderr": 0.004571647137441118,
"acc_norm": 0.8709420434176459,
"acc_norm_stderr": 0.0033457889052629477
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908352,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.01322392861674162,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.01322392861674162
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580435,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580435
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532067,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070813,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070813
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47980416156670747,
"mc1_stderr": 0.01748921684973705,
"mc2": 0.6443096859937673,
"mc2_stderr": 0.015348310496933864
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.010796468688068677
},
"harness|gsm8k|5": {
"acc": 0.6846095526914329,
"acc_stderr": 0.012799353675801832
}
}
```
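The results file linked above can also be read programmatically once downloaded. A minimal sketch, where the inline string stands in for the downloaded JSON (only the headline "all" block is shown):

```python
import json

# Minimal sketch: the inline string below stands in for the contents of the
# downloaded results JSON (see the link above); only the "all" block is kept.
raw = '{"all": {"acc": 0.6530256896006031, "acc_norm": 0.6530569014798114}}'
results = json.loads(raw)

# Extract the headline mean accuracy across tasks.
print(f"mean acc: {results['all']['acc']:.4f}")  # → mean acc: 0.6530
```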
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gzguevara/test2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: image
dtype: image
- name: mask_0
dtype: image
- name: mask_1
dtype: image
- name: mask_2
dtype: image
- name: mask_3
dtype: image
- name: mask_4
dtype: image
splits:
- name: train
num_bytes: 6082099.0
num_examples: 11
- name: test
num_bytes: 1944663.0
num_examples: 4
download_size: 8125511
dataset_size: 8026762.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
as-cle-bert/breastcanc-ultrasound-class | ---
license: cc
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': benign_breast_cancer
'1': malignant_breast_cancer
splits:
- name: train
num_bytes: 215979329.0
num_examples: 647
download_size: 215636753
dataset_size: 215979329.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# breastcanc-ultrasound-class
## Background
Cancer is the second leading cause of death worldwide, according to _IHME - Global Burden of Disease_, with 10.7 million deaths in 2019.

Among the various types of cancer, breast cancer plays a major role: it ranks fourth among the deadliest tumors, with more than 700,000 deaths in 2019 (_IHME - Global Burden of Disease_).

Moreover, breast cancer has the highest incidence worldwide (0.23 cases per 100 people; _IHME - Global Burden of Disease_), as shown in [Table 1](#table-1):
### Table 1
| Type of Cancer          | Cases per 100 people |
|-------------------------|----------------------|
| Breast Cancer           | 0.23                 |
| Colon and Rectum Cancer | 0.14                 |
| Prostate Cancer         | 0.13                 |
| Bladder Cancer          | 0.034                |
| Stomach Cancer          | 0.033                |
In this sense, it is vital to put intense effort into precision medicine and diagnostic tools for breast cancer: part of this effort should involve making curated datasets of diagnostic images available to the general public.
## Dataset source and composition
This dataset is part of the one kindly provided by [Walid Al-Dhabyani and collaborators in 2019](https://doi.org/10.1016/j.dib.2019.104863), and encompasses 647 ultrasound images: 437 of benign breast cancer and 210 of malignant breast cancer.
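The two classes are imbalanced (437 benign vs. 210 malignant, roughly 2:1). A common mitigation when training a classifier on such data is inverse-frequency class weighting; the sketch below uses the counts stated above (the weighting scheme is a generic choice, not something prescribed by the dataset authors):

```python
# Inverse-frequency class weights for the 647-image dataset.
# Label ids follow the card's class_label mapping:
# 0 = benign (437 images), 1 = malignant (210 images).
counts = {0: 437, 1: 210}
total = sum(counts.values())  # 647 images in total
num_classes = len(counts)

# weight[c] = total / (num_classes * count[c]): rarer classes get larger weights.
weights = {c: total / (num_classes * n) for c, n in counts.items()}

print(weights)  # the malignant class is weighted roughly 2x the benign one
```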
## License, references and citation
The dataset is hereby provided under CC family licenses.
Please cite Al-Dhabyani W, Gomaa M, Khaled H, Fahmy A. Dataset of breast ultrasound images. Data in Brief. 2020 Feb;28:104863. DOI: 10.1016/j.dib.2019.104863 when using it.
|
gigant/tib-wip-filtered | ---
dataset_info:
features:
- name: doi
dtype: string
- name: title
dtype: string
- name: url
dtype: string
- name: video_url
dtype: string
- name: license
dtype: string
- name: subject
dtype: string
- name: genre
dtype: string
- name: release_year
dtype: string
- name: author
dtype: string
- name: contributors
dtype: string
- name: abstract
dtype: string
- name: transcript
dtype: string
- name: transcript_segments
sequence:
- name: id
dtype: int32
- name: seek
dtype: int32
- name: start
dtype: float32
- name: end
dtype: float32
- name: text
dtype: string
- name: tokens
sequence: int32
- name: temperature
dtype: float32
- name: avg_logprob
dtype: float32
- name: compression_ratio
dtype: float32
- name: no_speech_prob
dtype: float32
- name: keyframes
sequence:
- name: slide
dtype: string
- name: frames
sequence: int32
- name: timestamp
sequence: float32
- name: language
dtype: string
splits:
- name: train
num_bytes: 1062896143.5539255
num_examples: 9294
download_size: 511200645
dataset_size: 1062896143.5539255
---
# Dataset Card for "tib-wip-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thanhduycao/oscar_vi_shard_0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 8382084494.112768
num_examples: 2474428
download_size: 4368998322
dataset_size: 8382084494.112768
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oscar_vi_shard_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jinaai/miracl-es | ---
license: apache-2.0
---
## MIRACL Dataset
This dataset is a reformatted version of the original [MIRACL dataset](https://huggingface.co/datasets/miracl/miracl),
converted into the format expected for MTEB reranking tasks and limited to Spanish only. |
disi-unibo-nlp/medqa-5-opt-MedGENIE | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: target
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: text
dtype: string
splits:
- name: train
num_bytes: 77044736
num_examples: 10178
- name: validation
num_bytes: 9662825
num_examples: 1272
- name: test
num_bytes: 9719509
num_examples: 1273
download_size: 5761417
dataset_size: 96427070
license: mit
task_categories:
- question-answering
language:
- en
tags:
- medical
---
# Dataset Card for "medqa-5-opt-MedGENIE"
## Dataset Description
The data is a part of the MedGENIE collection of medical datasets augmented with artificial contexts generated by [PMC-LLaMA-13B](https://huggingface.co/axiong/PMC_LLaMA_13B). Specifically, up to 5 artificial contexts were generated for each question in [MedQA-USMLE](https://github.com/jind11/MedQA) (5 options), employing a multi-view approach to encompass various perspectives associated with the given question.
The dataset has been used to train [MedGENIE-fid-flan-t5-base-medqa](https://huggingface.co/disi-unibo-nlp/MedGENIE-fid-flan-t5-base-medqa) allowing it to reach a new state-of-the-art on the MedQA-USMLE test set.
## Dataset Structure
The dataset has three splits, suitable for:
* Training *question-answering* models, including *fusion-in-decoder* architectures.
* Augmenting your LLMs during inference with generated contexts rather than retrieved chunks.
* Augmenting your knowledge base of factual documents with generated contexts for a standard RAG pipeline.
The number of examples per split is:
- **train:** 10178 samples
- **validation:** 1272 samples
- **test:** 1273 samples
The dataset is stored in parquet format with each entry using the following schema:
```
{
"id": 0,
"question": "A 23-year-old pregnant woman at 22 weeks gestation presents with burning upon urination. She states it started 1 day ago and has been worsening despite drinking more water and taking cranberry extract. She otherwise feels well and is followed by a doctor for her pregnancy. Her temperature is 97.7\u00b0F (36.5\u00b0C), blood pressure is 122/77 mmHg, pulse is 80/min, respirations are 19/min, and oxygen saturation is 98% on room air. Physical exam is notable for an absence of costovertebral angle tenderness and a gravid uterus. Which of the following is the best treatment for this patient?\nA. Ampicillin\nB. Ceftriaxone\nC. Ciprofloxacin\nD. Doxycycline\nE. Nitrofurantoin",
"target": "E",
"answers": [
"E"
],
"ctxs": [
{
"text": "This is a case of uncomplicated cystitis, which is frequently seen in pregnancy. Symptoms include urinary frequency,..."
},
{
"text": "The burning upon urination in a pregnant female is often due to asymptomatic bacteriuria that results in a urinary tract..."
},
{
"text": "The patient's symptoms are consistent with a lower urinary tract infection. An accurate history and physical exam exclude the..."
},
{
"text": "Asymptomatic bacteriuria is a frequent finding in pregnancy. Treatment is not recommended unless there are signs of an upper urinary..."
},
{
"text": "Asymptomatic bacteriuria is present if a patient has persistent (>2 weeks) bacteria in the urine as documented by a positive urine..."
}
]
}
```
## Augmenting LLMs during inference
Augmenting *state-of-the-art* LLMs with generated contexts from **medqa-5-opt-MedGENIE** demonstrated a remarkable performance boost. For a given question, all relevant contexts are concatenated and passed within the context window of the LLM.
| Model | Learning|medqa-5-opt-MedGENIE |Accuracy |
|------|------|-----|-----|
| LLaMA-2-chat (7B)|2-shot | NO|29.2 |
| LLaMA-2-chat (7B)| 2-shot|YES |47.1 **(+ 17.9)** |
| Zephyr-β (7B)|2-shot|NO | 43.1 |
| Zephyr-β (7B)|2-shot| YES |54.9 **(+ 11.8)** |
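The "concatenate and pass" step described above can be sketched as follows; the prompt template and truncation policy are illustrative assumptions, not the exact setup used for the reported numbers:

```python
def build_augmented_prompt(question: str, ctxs: list, max_ctxs: int = 5) -> str:
    """Concatenate up to `max_ctxs` generated contexts ahead of the question."""
    context_block = "\n\n".join(c["text"] for c in ctxs[:max_ctxs])
    return (
        f"Context:\n{context_block}\n\n"
        f"Question:\n{question}\n\n"
        "Answer with the letter of the correct option."
    )

# Toy contexts in the same shape as this dataset's `ctxs` field.
ctxs = [
    {"text": "Nitrofurantoin is a first-line agent for cystitis in pregnancy."},
    {"text": "Fluoroquinolones and tetracyclines are avoided in pregnancy."},
]
prompt = build_augmented_prompt("Which antibiotic is best for this pregnant patient?", ctxs)
print(prompt)
```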
## Evaluation for RAG
To assess the effectiveness of using our generated contexts in a RAG pipeline, we augment the [MedWiki](https://huggingface.co/datasets/VOD-LM/medwiki) dataset with a smaller portion of artificially generated chunks derived from the train and test sets of **medqa-5-opt-MedGENIE** and [medmcqa-MedGENIE](https://huggingface.co/datasets/disi-unibo-nlp/medmcqa-MedGENIE).
| MedWiki chunks | Artificial chunks | Rerank | LLaMA-2-chat (7B) | mistral-instruct (7B) | Zephyr-β (7B) |
|------|-----|----------------|-------------------|-----------------------|---------------------|
| 4.5M | - | NO | 32.2 | 36.8 | 44.7 |
| 4.5M | 96K (only test) | NO | 35.8 **(+ 3.5)** | 37.9 **(+ 1.1)** | 47.5 **(+2.8)** |
| 4.5M | 2M (train + test)| NO | 36.3 **(+ 4.1)** | 37.9 **(+ 1.1)** | 47.8 **(+3.1)** |
| 4.5M | - | YES | 32.8 | 35.1 | 44.0 |
| 4.5M | 96K (only test)| YES | 36.5 **(+3.7)** | 37.6 **(+2.5)** | 47.8 **(+2.8)** |
| 4.5M | 2M (train + test)| YES | 33.5 **(+0.8)** | 37.2 **(+2.1)** | 47.9 **(+3.9)** |
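Conceptually, "augmenting MedWiki with generated chunks" just means adding the artificial passages to the retrieval index and retrieving as usual. The word-overlap scorer below is a toy stand-in for whatever retriever the actual pipeline uses; all names and documents are illustrative:

```python
def tokens(s: str) -> set:
    """Lowercased words with trailing punctuation stripped."""
    return {w.strip(".,?").lower() for w in s.split()}

def score(query: str, doc: str) -> int:
    """Toy relevance score: number of shared words between query and document."""
    return len(tokens(query) & tokens(doc))

medwiki_chunks = ["The heart pumps blood through the circulatory system."]
generated_chunks = ["Nitrofurantoin treats cystitis during pregnancy safely."]

# Augmented index = factual MedWiki chunks + artificially generated chunks.
index = medwiki_chunks + generated_chunks

query = "What treats cystitis in pregnancy?"
top = max(index, key=lambda d: score(query, d))
print(top)  # the generated chunk is the best match for this query
```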
## Citation
If you find this dataset useful in your work, please cite it with:
```
@misc{frisoni2024generate,
title={To Generate or to Retrieve? On the Effectiveness of Artificial Contexts for Medical Open-Domain Question Answering},
author={Giacomo Frisoni and Alessio Cocchieri and Alex Presepi and Gianluca Moro and Zaiqiao Meng},
year={2024},
eprint={2403.01924},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
adalib/colossalai-data | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 1898043
num_examples: 201
- name: test
num_bytes: 510789
num_examples: 60
download_size: 755695
dataset_size: 2408832
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Erynan/shuffled_formatted_util_deontology_for_llama2_v2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 26907365
num_examples: 30471
download_size: 4806414
dataset_size: 26907365
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jtmichels/data1 | ---
license: creativeml-openrail-m
---
|
juliensimon/amazon-shoe-reviews | ---
language: en
dataset_info:
features:
- name: labels
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 16847665.2
num_examples: 90000
- name: test
num_bytes: 1871962.8
num_examples: 10000
download_size: 0
dataset_size: 18719628.0
---
# Dataset Card for "amazon-shoe-reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aazwan/carigold_general_chat_text_dataset | ---
language:
- ms
- en
pretty_name: malay
---
Text data from replies in the General Chat section of the Carigold forum (https://carigold.com/forum/forums/general-chat.174/).
Language: mixed Malay and English. |
liuyanchen1015/MULTI_VALUE_cola_is_am_1s | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2160
num_examples: 28
- name: test
num_bytes: 2099
num_examples: 26
- name: train
num_bytes: 18346
num_examples: 244
download_size: 16951
dataset_size: 22605
---
# Dataset Card for "MULTI_VALUE_cola_is_am_1s"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/LosslessMegaCodeTrainingV2-1m-Evol-Uncensored-standardized | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 1508409661
num_examples: 1884414
download_size: 839218210
dataset_size: 1508409661
---
# Dataset Card for "LosslessMegaCodeTrainingV2-1m-Evol-Uncensored-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reaganjlee/boolq_pt | ---
dataset_info:
features:
- name: question
dtype: string
- name: passage
dtype: string
- name: answer
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: validation
num_bytes: 1604091
num_examples: 3270
- name: train
num_bytes: 4624752
num_examples: 9427
download_size: 3843346
dataset_size: 6228843
---
# Dataset Card for "boolq_pt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Anusha64/BHA1 | ---
license: mit
---
|
society-ethics/BlogPostOpenness | ---
license: cc-by-4.0
---
## Mission: Open and Good ML
In our mission to democratize good machine learning (ML), we examine how supporting ML community work also empowers examining and preventing possible harms. Open development and science decentralizes power so that many people can collectively work on AI that reflects their needs and values. While [openness enables broader perspectives to contribute to research and AI overall, it faces the tension of less risk control](https://arxiv.org/abs/2302.04844).
Moderating ML artifacts presents unique challenges due to the dynamic and rapidly evolving nature of these systems. In fact, as ML models become more advanced and capable of producing increasingly diverse content, the potential for harmful or unintended outputs grows, necessitating the development of robust moderation and evaluation strategies. Moreover, the complexity of ML models and the vast amounts of data they process exacerbate the challenge of identifying and addressing potential biases and ethical concerns.
As hosts, we recognize the responsibility that comes with potentially amplifying harm to our users and the world more broadly. Often these harms disparately impact minority communities in a context-dependent manner. We have taken the approach of analyzing the tensions in play for each context, open to discussion across the company and Hugging Face community. While many models can amplify harm, especially discriminatory content, we are taking a series of steps to identify highest risk models and what action to take. Importantly, active perspectives from many backgrounds is key to understanding, measuring, and mitigating potential harms that affect different groups of people.
We are crafting tools and safeguards in addition to improving our documentation practices to ensure open source science empowers individuals and continues to minimize potential harms.
## Ethical Categories
The first major aspect of our work to foster good open ML consists in promoting the tools and positive examples of ML development that prioritize values and consideration for its stakeholders. This helps users take concrete steps to address outstanding issues, and present plausible alternatives to de facto damaging practices in ML development.
To help our users discover and engage with ethics-related ML work, we have compiled a set of tags. These 6 high-level categories are based on our analysis of Spaces that community members had contributed. They are designed to give you a jargon-free way of thinking about ethical technology:
- Rigorous work pays special attention to developing with best practices in mind. In ML, this can mean examining failure cases (including conducting bias and fairness audits), protecting privacy through security measures, and ensuring that potential users (technical and non-technical) are informed about the project's limitations.
- Consentful work [supports](https://www.consentfultech.io/) the self-determination of people who use and are affected by these technologies.
- Socially Conscious work shows us how technology can support social, environmental, and scientific efforts.
- Sustainable work highlights and explores techniques for making machine learning ecologically sustainable.
- Inclusive work broadens the scope of who builds and benefits in the machine learning world.
- Inquisitive work shines a light on inequities and power structures which challenge the community to rethink its relationship to technology.
Read more at https://huggingface.co/ethics
Look for these terms as we’ll be using these tags, and updating them based on community contributions, across some new projects on the Hub!
## Safeguards
Taking an “all-or-nothing” view of open releases ignores the wide variety of contexts that determine an ML artifact’s positive or negative impacts. Having more levers of control over how ML systems are shared and re-used supports collaborative development and analysis with less risk of promoting harmful uses or misuses; allowing for more openness and participation in innovation for shared benefits.
We engage directly with contributors and have addressed pressing issues. To bring this to the next level, we are building community-based processes. This approach empowers both Hugging Face contributors, and those affected by contributions, to inform the limitations, sharing, and additional mechanisms necessary for models and data made available on our platform. The three main aspects we will pay attention to are: the origin of the artifact, how the artifact is handled by its developers, and how the artifact has been used. In that respect we:
- launched a [flagging feature](https://twitter.com/GiadaPistilli/status/1571865167092396033) for our community to determine whether ML artifacts or community content (model, dataset, space, or discussion) violate our [content guidelines](https://huggingface.co/content-guidelines),
- monitor our community discussion boards to ensure Hub users abide by the [code of conduct](https://huggingface.co/code-of-conduct),
- robustly document our most-downloaded models with model cards that detail social impacts, biases, and intended and out-of-scope use cases,
- create audience-guiding tags, such as the “Not For All Audiences” tag that can be added to the repository’s card metadata to avoid un-requested violent and sexual content,
- promote use of [Open Responsible AI Licenses (RAIL)](https://huggingface.co/blog/open_rail) for [models](https://www.licenses.ai/blog/2022/8/26/bigscience-open-rail-m-license), such as with LLMs ([BLOOM](https://huggingface.co/spaces/bigscience/license), [BigCode](https://huggingface.co/spaces/bigcode/license)),
- conduct research that [analyzes](https://arxiv.org/abs/2302.04844) which models and datasets have the highest potential for, or track record of, misuse and malicious use.
**How to use the flagging function:**
Click on the flag icon on any Model, Dataset, Space, or Discussion:
<p align="center">
<br>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/ethics_soc_3/flag2.jpg" alt="screenshot pointing to the flag icon to Report this model" />
</p>
Share why you flagged this item:
<p align="center">
<br>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/ethics_soc_3/flag1.jpg" alt="screenshot showing the text window where you describe why you flagged this item" />
</p>
In prioritizing open science, we examine potential harm on a case-by-case basis. When users flag a system, developers can directly and transparently respond to concerns. Moderators are able to disengage from discussions should behavior become hateful and/or abusive (see [code of conduct](https://huggingface.co/code-of-conduct)).
Should a specific model be flagged as high risk by our community, we consider:
- Downgrading the ML artifact’s visibility across the Hub in the trending tab and in feeds,
- Requesting that the models be made private,
- Gating access to ML artifacts (see documentation for [models](https://huggingface.co/docs/hub/models-gated) and [datasets](https://huggingface.co/docs/hub/datasets-gated)),
- Disabling access.
**How to add the “Not For All Audiences” tag:**
Edit the model/data card → add `not-for-all-audiences` in the tags section → open the PR and wait for the authors to merge it. Once merged, the following tag will be displayed on the repository:
<p align="center">
<br>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/ethics_soc_3/nfaa_tag.png" alt="screenshot showing where to add tags" />
</p>
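In the card metadata, this amounts to a one-line addition to the repository card's YAML front matter, e.g.:

```yaml
---
tags:
  - not-for-all-audiences
---
```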
Any repository tagged `not-for-all-audiences` will display the following popup when visited:
<p align="center">
<br>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/ethics_soc_3/nfaa2.png" alt="screenshot showing where to add tags" />
</p>
Clicking "View Content" will allow you to view the repository as normal. If you wish to always view `not-for-all-audiences`-tagged repositories without the popup, you can change this setting in your [Content Preferences](https://huggingface.co/settings/content-preferences).
<p align="center">
<br>
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/ethics_soc_3/nfaa1.png" alt="screenshot showing where to add tags" />
</p>
Open science requires safeguards, and one of our goals is to create an environment informed by tradeoffs with different values. Hosting and providing access to models in addition to cultivating community and discussion empowers diverse groups to assess social implications and guide what is good machine learning.
## Are you working on safeguards? Share them on Hugging Face Hub!
The most important part of Hugging Face is our community. If you’re a researcher working on making ML safer to use, especially for open science, we want to support and showcase your work!
Here are some recent demos and tools from researchers in the Hugging Face community:
- [A Watermark for LLMs](https://huggingface.co/spaces/tomg-group-umd/lm-watermarking) by John Kirchenbauer, Jonas Geiping, Yuxin Wen, Jonathan Katz, Ian Miers, Tom Goldstein ([paper](https://arxiv.org/abs/2301.10226))
- [Generate Model Cards Tool](https://huggingface.co/spaces/huggingface/Model_Cards_Writing_Tool) by the Hugging Face team
- [Photoguard](https://huggingface.co/spaces/RamAnanth1/photoguard) to safeguard images against manipulation by Ram Ananth
Thanks for reading! 🤗
~ Irene, Nima, Giada, Yacine, and Elizabeth, on behalf of the Ethics and Society regulars
If you want to cite this blog post, please use the following:
```
@misc{hf_ethics_soc_blog_3,
author = {Irene Solaiman and
Giada Pistilli and
Nima Boscarino and
Yacine Jernite and
Elizabeth Allendorf and
Margaret Mitchell and
Carlos Muñoz Ferrandis and
Nathan Lambert and
Alexandra Sasha Luccioni
},
title = {Hugging Face Ethics and Society Newsletter 3: Ethical Openness at Hugging Face},
booktitle = {Hugging Face Blog},
year = {2023},
url = {https://doi.org/10.57967/hf/0487},
doi = {10.57967/hf/0487}
}
``` |
patruff/chucklesMistralEFT3 | ---
dataset_info:
features:
- name: text
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 654902
num_examples: 672
- name: test
num_bytes: 162814
num_examples: 168
download_size: 97425
dataset_size: 817716
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-596cbd-1668659070 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: facebook/opt-1.3b
metrics: ['f1', 'perplexity']
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-1.3b
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ddcas](https://huggingface.co/ddcas) for evaluating this model. |
MallardDuck/scryfall-dataset | ---
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: oracle_id
dtype: string
- name: name
dtype: string
- name: released_at
dtype: string
- name: uri
dtype: string
- name: mana_cost
dtype: string
- name: cmc
dtype: float64
- name: type_line
dtype: string
- name: oracle_text
dtype: string
- name: power
dtype: string
- name: toughness
dtype: string
- name: loyalty
dtype: string
- name: colors
sequence: string
- name: color_identity
sequence: string
- name: keywords
sequence: string
- name: edhrec_rank
dtype: float64
splits:
- name: train
num_bytes: 13145409
num_examples: 30845
- name: train_clean
num_bytes: 11096387
num_examples: 25438
download_size: 12595090
dataset_size: 24241796
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_clean
path: data/train_clean-*
---
|
open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5 | ---
pretty_name: Evaluation run of migtissera/SynthIA-70B-v1.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/SynthIA-70B-v1.5](https://huggingface.co/migtissera/SynthIA-70B-v1.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T03:41:45.134305](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5/blob/main/results_2024-01-08T03-41-45.134305.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6908264640585592,\n\
\ \"acc_stderr\": 0.03059066166329622,\n \"acc_norm\": 0.6946757184916225,\n\
\ \"acc_norm_stderr\": 0.031189654753338784,\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5740125787335826,\n\
\ \"mc2_stderr\": 0.015069171807224776\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6518771331058021,\n \"acc_stderr\": 0.013921008595179349,\n\
\ \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276513\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6813383788090022,\n\
\ \"acc_stderr\": 0.0046500521500943935,\n \"acc_norm\": 0.8697470623381797,\n\
\ \"acc_norm_stderr\": 0.00335893627986726\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.03279000406310051,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.03279000406310051\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.021417242936321582,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.021417242936321582\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853106,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853106\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7307692307692307,\n \"acc_stderr\": 0.022489389793654817,\n\
\ \"acc_norm\": 0.7307692307692307,\n \"acc_norm_stderr\": 0.022489389793654817\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827947,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827947\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683775,\n \"\
acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8990825688073395,\n \"acc_stderr\": 0.012914673545364432,\n \"\
acc_norm\": 0.8990825688073395,\n \"acc_norm_stderr\": 0.012914673545364432\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6157407407407407,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.6157407407407407,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8872549019607843,\n \"acc_stderr\": 0.02219857103945679,\n \"\
acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.02219857103945679\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.032484700838071943,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.032484700838071943\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.018724301741941646,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.018724301741941646\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n\
\ \"acc_stderr\": 0.012331009307795656,\n \"acc_norm\": 0.8620689655172413,\n\
\ \"acc_norm_stderr\": 0.012331009307795656\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.022497230190967558,\n\
\ \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.022497230190967558\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5743016759776536,\n\
\ \"acc_stderr\": 0.016536829648997116,\n \"acc_norm\": 0.5743016759776536,\n\
\ \"acc_norm_stderr\": 0.016536829648997116\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02392915551735129,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02392915551735129\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.021887704613396147,\n\
\ \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.021887704613396147\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5430247718383312,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.5430247718383312,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02576725201085596,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02576725201085596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427646,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.04172343038705382,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.04172343038705382\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.02540930195322568,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.02540930195322568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.5740125787335826,\n\
\ \"mc2_stderr\": 0.015069171807224776\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5481425322213799,\n \
\ \"acc_stderr\": 0.01370849499567764\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/SynthIA-70B-v1.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|arc:challenge|25_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|gsm8k|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hellaswag|10_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-41-45.134305.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T03-41-45.134305.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- '**/details_harness|winogrande|5_2024-01-08T03-41-45.134305.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T03-41-45.134305.parquet'
- config_name: results
data_files:
- split: 2024_01_08T03_41_45.134305
path:
- results_2024-01-08T03-41-45.134305.parquet
- split: latest
path:
- results_2024-01-08T03-41-45.134305.parquet
---
# Dataset Card for Evaluation run of migtissera/SynthIA-70B-v1.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [migtissera/SynthIA-70B-v1.5](https://huggingface.co/migtissera/SynthIA-70B-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-08T03:41:45.134305](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5/blob/main/results_2024-01-08T03-41-45.134305.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6908264640585592,
"acc_stderr": 0.03059066166329622,
"acc_norm": 0.6946757184916225,
"acc_norm_stderr": 0.031189654753338784,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5740125787335826,
"mc2_stderr": 0.015069171807224776
},
"harness|arc:challenge|25": {
"acc": 0.6518771331058021,
"acc_stderr": 0.013921008595179349,
"acc_norm": 0.6936860068259386,
"acc_norm_stderr": 0.013470584417276513
},
"harness|hellaswag|10": {
"acc": 0.6813383788090022,
"acc_stderr": 0.0046500521500943935,
"acc_norm": 0.8697470623381797,
"acc_norm_stderr": 0.00335893627986726
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.03279000406310051,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.03279000406310051
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.021417242936321582,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.021417242936321582
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853106,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.022489389793654817,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.022489389793654817
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8990825688073395,
"acc_stderr": 0.012914673545364432,
"acc_norm": 0.8990825688073395,
"acc_norm_stderr": 0.012914673545364432
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.6157407407407407,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8872549019607843,
"acc_stderr": 0.02219857103945679,
"acc_norm": 0.8872549019607843,
"acc_norm_stderr": 0.02219857103945679
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884562,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.032484700838071943,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.032484700838071943
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941646,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941646
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795656,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795656
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.022497230190967558,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.022497230190967558
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5743016759776536,
"acc_stderr": 0.016536829648997116,
"acc_norm": 0.5743016759776536,
"acc_norm_stderr": 0.016536829648997116
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.808641975308642,
"acc_stderr": 0.021887704613396147,
"acc_norm": 0.808641975308642,
"acc_norm_stderr": 0.021887704613396147
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5430247718383312,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.5430247718383312,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02576725201085596,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02576725201085596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.017401816711427646,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.017401816711427646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705382,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705382
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.02540930195322568,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.02540930195322568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.5740125787335826,
"mc2_stderr": 0.015069171807224776
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
},
"harness|gsm8k|5": {
"acc": 0.5481425322213799,
"acc_stderr": 0.01370849499567764
}
}
```
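The "all" block above is an unweighted macro-average over the per-task metrics. As an illustration, the aggregation can be recomputed locally from a results dictionary; the snippet below is a minimal sketch over a hand-copied subset of three tasks (the leaderboard's exact aggregation pipeline may differ):

```python
import statistics

# Hand-copied subset of the per-task accuracies reported above
# (illustrative only -- the real results file covers every task).
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.31,
    "harness|hendrycksTest-anatomy|5": 0.6148148148148148,
    "harness|hendrycksTest-astronomy|5": 0.7960526315789473,
}

def macro_average_acc(task_accs):
    """Unweighted mean of per-task accuracy: every task counts equally."""
    return statistics.mean(task_accs.values())

print(f"macro-average acc over {len(results)} tasks: {macro_average_acc(results):.4f}")
```

Applied to the full results file, this kind of unweighted mean is what produces the headline `acc` in the `all` block.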
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_14m_thr_0.0_seed_1_t_1.0 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43723341
num_examples: 18928
- name: epoch_1
num_bytes: 44285617
num_examples: 18928
- name: epoch_2
num_bytes: 44368367
num_examples: 18928
- name: epoch_3
num_bytes: 44409265
num_examples: 18928
- name: epoch_4
num_bytes: 44407646
num_examples: 18928
- name: epoch_5
num_bytes: 44385418
num_examples: 18928
- name: epoch_6
num_bytes: 44373360
num_examples: 18928
- name: epoch_7
num_bytes: 44368533
num_examples: 18928
- name: epoch_8
num_bytes: 44365005
num_examples: 18928
- name: epoch_9
num_bytes: 44364145
num_examples: 18928
- name: epoch_10
num_bytes: 44364461
num_examples: 18928
- name: epoch_11
num_bytes: 44363418
num_examples: 18928
- name: epoch_12
num_bytes: 44364256
num_examples: 18928
- name: epoch_13
num_bytes: 44362384
num_examples: 18928
- name: epoch_14
num_bytes: 44363810
num_examples: 18928
- name: epoch_15
num_bytes: 44363104
num_examples: 18928
- name: epoch_16
num_bytes: 44362623
num_examples: 18928
- name: epoch_17
num_bytes: 44362595
num_examples: 18928
- name: epoch_18
num_bytes: 44363664
num_examples: 18928
- name: epoch_19
num_bytes: 44363081
num_examples: 18928
- name: epoch_20
num_bytes: 44363046
num_examples: 18928
- name: epoch_21
num_bytes: 44363630
num_examples: 18928
- name: epoch_22
num_bytes: 44362776
num_examples: 18928
- name: epoch_23
num_bytes: 44363335
num_examples: 18928
- name: epoch_24
num_bytes: 44363697
num_examples: 18928
- name: epoch_25
num_bytes: 44363018
num_examples: 18928
- name: epoch_26
num_bytes: 44363189
num_examples: 18928
- name: epoch_27
num_bytes: 44363280
num_examples: 18928
- name: epoch_28
num_bytes: 44363046
num_examples: 18928
- name: epoch_29
num_bytes: 44363735
num_examples: 18928
download_size: 700504945
dataset_size: 1330316845
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
open-llm-leaderboard/details_SuperAGI__SAM | ---
pretty_name: Evaluation run of SuperAGI/SAM
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SuperAGI/SAM](https://huggingface.co/SuperAGI/SAM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SuperAGI__SAM\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T11:58:05.499666](https://huggingface.co/datasets/open-llm-leaderboard/details_SuperAGI__SAM/blob/main/results_2023-12-29T11-58-05.499666.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6164371891665963,\n\
\ \"acc_stderr\": 0.032710001289537494,\n \"acc_norm\": 0.6244254048530119,\n\
\ \"acc_norm_stderr\": 0.033400446150554805,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720123,\n \"mc2\": 0.5263967146285616,\n\
\ \"mc2_stderr\": 0.015136951079391848\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804232,\n\
\ \"acc_norm\": 0.5938566552901023,\n \"acc_norm_stderr\": 0.014351656690097862\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6258713403704441,\n\
\ \"acc_stderr\": 0.0048290815328265015,\n \"acc_norm\": 0.8231428002389962,\n\
\ \"acc_norm_stderr\": 0.0038076803311729033\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.024993053397764815,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.024993053397764815\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091095,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091095\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3888268156424581,\n\
\ \"acc_stderr\": 0.016303899530796136,\n \"acc_norm\": 0.3888268156424581,\n\
\ \"acc_norm_stderr\": 0.016303899530796136\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379778,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379778\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854128,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854128\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720123,\n \"mc2\": 0.5263967146285616,\n\
\ \"mc2_stderr\": 0.015136951079391848\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275623\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22896133434420016,\n \
\ \"acc_stderr\": 0.011573412892418223\n }\n}\n```"
repo_url: https://huggingface.co/SuperAGI/SAM
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|arc:challenge|25_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|gsm8k|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hellaswag|10_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T11-58-05.499666.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T11-58-05.499666.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- '**/details_harness|winogrande|5_2023-12-29T11-58-05.499666.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T11-58-05.499666.parquet'
- config_name: results
data_files:
- split: 2023_12_29T11_58_05.499666
path:
- results_2023-12-29T11-58-05.499666.parquet
- split: latest
path:
- results_2023-12-29T11-58-05.499666.parquet
---
# Dataset Card for Evaluation run of SuperAGI/SAM
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SuperAGI/SAM](https://huggingface.co/SuperAGI/SAM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SuperAGI__SAM",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-29T11:58:05.499666](https://huggingface.co/datasets/open-llm-leaderboard/details_SuperAGI__SAM/blob/main/results_2023-12-29T11-58-05.499666.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the timestamped splits and the "latest" split of each configuration):
```json
{
"all": {
"acc": 0.6164371891665963,
"acc_stderr": 0.032710001289537494,
"acc_norm": 0.6244254048530119,
"acc_norm_stderr": 0.033400446150554805,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720123,
"mc2": 0.5263967146285616,
"mc2_stderr": 0.015136951079391848
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.014506769524804232,
"acc_norm": 0.5938566552901023,
"acc_norm_stderr": 0.014351656690097862
},
"harness|hellaswag|10": {
"acc": 0.6258713403704441,
"acc_stderr": 0.0048290815328265015,
"acc_norm": 0.8231428002389962,
"acc_norm_stderr": 0.0038076803311729033
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091095,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3888268156424581,
"acc_stderr": 0.016303899530796136,
"acc_norm": 0.3888268156424581,
"acc_norm_stderr": 0.016303899530796136
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379778,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379778
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854128,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720123,
"mc2": 0.5263967146285616,
"mc2_stderr": 0.015136951079391848
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275623
},
"harness|gsm8k|5": {
"acc": 0.22896133434420016,
"acc_stderr": 0.011573412892418223
}
}
```
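As a sketch of how the per-task numbers above can be post-processed once loaded (the dict literal is a hand-copied subset of the JSON, and the helper names are illustrative, not part of any harness API):

```python
# Hypothetical sketch: extracting per-task accuracy from a results payload
# shaped like the JSON above. Keys follow the "harness|<task>|<n_shot>"
# naming convention used throughout this card.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5938566552901023},
    "harness|hellaswag|10": {"acc_norm": 0.8231428002389962},
    "harness|winogrande|5": {"acc": 0.7640094711917916},
    "harness|gsm8k|5": {"acc": 0.22896133434420016},
}

def task_name(key: str) -> str:
    """Strip the 'harness|' prefix and '|<n_shot>' suffix from a result key."""
    return key.split("|")[1]

# Prefer the length-normalized accuracy when it is reported.
scores = {
    task_name(k): v.get("acc_norm", v.get("acc"))
    for k, v in results.items()
}
for task, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{task:>15}: {score:.4f}")
```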
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tomaarsen/conllpp | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|conll2003
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: conll
pretty_name: CoNLL++
train-eval-index:
- config: conllpp
task: token-classification
task_id: entity_extraction
splits:
train_split: train
eval_split: test
col_mapping:
tokens: tokens
ner_tags: tags
metrics:
- type: seqeval
name: seqeval
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
0: '"'
1: ''''''
2: '#'
3: $
4: (
5: )
6: ','
7: .
8: ':'
9: '``'
10: CC
11: CD
12: DT
13: EX
14: FW
15: IN
16: JJ
17: JJR
18: JJS
19: LS
20: MD
21: NN
22: NNP
23: NNPS
24: NNS
25: NN|SYM
26: PDT
27: POS
28: PRP
29: PRP$
30: RB
31: RBR
32: RBS
33: RP
34: SYM
35: TO
36: UH
37: VB
38: VBD
39: VBG
40: VBN
41: VBP
42: VBZ
43: WDT
44: WP
45: WP$
46: WRB
- name: chunk_tags
sequence:
class_label:
names:
0: O
1: B-ADJP
2: I-ADJP
3: B-ADVP
4: I-ADVP
5: B-CONJP
6: I-CONJP
7: B-INTJ
8: I-INTJ
9: B-LST
10: I-LST
11: B-NP
12: I-NP
13: B-PP
14: I-PP
15: B-PRT
16: I-PRT
17: B-SBAR
18: I-SBAR
19: B-UCP
20: I-UCP
21: B-VP
22: I-VP
- name: ner_tags
sequence:
class_label:
names:
0: O
1: B-PER
2: I-PER
3: B-ORG
4: I-ORG
5: B-LOC
6: I-LOC
7: B-MISC
8: I-MISC
config_name: conllpp
splits:
- name: train
num_bytes: 6931393
num_examples: 14041
- name: validation
num_bytes: 1739247
num_examples: 3250
- name: test
num_bytes: 1582078
num_examples: 3453
download_size: 4859600
dataset_size: 10252718
---
# Dataset Card for "conllpp"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/ZihanWangKi/CrossWeigh)
- **Repository:** [Github](https://github.com/ZihanWangKi/CrossWeigh)
- **Paper:** [Aclweb](https://www.aclweb.org/anthology/D19-1519)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
CoNLLpp is a corrected version of the CoNLL2003 NER dataset, in which the labels of 5.38% of the sentences in the test set
have been manually corrected. The training and development sets from CoNLL2003 are included for completeness. One
example of a correction in the test set is:
```
{
"tokens": ["SOCCER", "-", "JAPAN", "GET", "LUCKY", "WIN", ",", "CHINA", "IN", "SURPRISE", "DEFEAT", "."],
"original_ner_tags_in_conll2003": ["O", "O", "B-LOC", "O", "O", "O", "O", "B-PER", "O", "O", "O", "O"],
    "corrected_ner_tags_in_conllpp": ["O", "O", "B-LOC", "O", "O", "O", "O", "B-LOC", "O", "O", "O", "O"]
}
```
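To make the BIO encoding concrete, the sketch below decodes the integer `ner_tags` of the corrected example above into entity spans, using the label list from this card's feature definitions. The `extract_entities` helper is purely illustrative and not part of the dataset or any library:

```python
# Label list as defined in this card's `ner_tags` feature (B-LOC = 5).
NER_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

def extract_entities(tokens, tag_ids):
    """Group BIO-encoded tag ids into (entity_type, text) spans."""
    entities, current = [], None
    for token, tag_id in zip(tokens, tag_ids):
        label = NER_LABELS[tag_id]
        if label.startswith("B-"):
            # A B- tag always starts a new span, closing any open one.
            if current:
                entities.append(current)
            current = (label[2:], [token])
        elif label.startswith("I-") and current and current[0] == label[2:]:
            # An I- tag of the same type continues the open span.
            current[1].append(token)
        else:
            # "O" (or a mismatched I- tag) closes the open span.
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(etype, " ".join(toks)) for etype, toks in entities]

# The corrected test-set sentence from above:
tokens = ["SOCCER", "-", "JAPAN", "GET", "LUCKY", "WIN", ",", "CHINA", "IN", "SURPRISE", "DEFEAT", "."]
tags = [0, 0, 5, 0, 0, 0, 0, 5, 0, 0, 0, 0]
print(extract_entities(tokens, tags))  # → [('LOC', 'JAPAN'), ('LOC', 'CHINA')]
```

Under the original CoNLL2003 labels, `CHINA` would instead decode as a `PER` entity, which is the kind of test-set error CoNLL++ corrects.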
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
#### conllpp
- **Size of downloaded dataset files:** 4.85 MB
- **Size of the generated dataset:** 10.26 MB
- **Total amount of disk used:** 15.11 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"id": "0",
"document_id": 1,
"sentence_id": 3,
    "tokens": ["The", "European", "Commission", "said", "on", "Thursday", "it", "disagreed", "with", "German", "advice", "to", "consumers", "to", "shun", "British", "lamb", "until", "scientists", "determine", "whether", "mad", "cow", "disease", "can", "be", "transmitted", "to", "sheep", "."],
    "pos_tags": [12, 22, 22, 38, 15, 22, 28, 38, 15, 16, 21, 35, 24, 35, 37, 16, 21, 15, 24, 41, 15, 16, 21, 21, 20, 37, 40, 35, 21, 7],
    "ner_tags": [0, 3, 4, 0, 0, 0, 0, 0, 0, 7, 0, 0, 0, 0, 0, 7, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
    "chunk_tags": [11, 12, 12, 21, 13, 11, 11, 21, 13, 11, 12, 13, 11, 21, 22, 11, 12, 17, 11, 21, 17, 11, 12, 12, 21, 22, 22, 13, 11, 0]
}
```
### Data Fields
The data fields are the same among all splits.
#### conllpp
- `id`: a `string` feature.
- `document_id`: an `int32` feature tracking which document the sample is from.
- `sentence_id`: an `int32` feature tracking which sentence in this document the sample is from.
- `tokens`: a `list` of `string` features.
- `pos_tags`: a `list` of classification labels, with possible values including `"` (0), `''` (1), `#` (2), `$` (3), `(` (4).
- `chunk_tags`: a `list` of classification labels, with possible values including `O` (0), `B-ADJP` (1), `I-ADJP` (2), `B-ADVP` (3), `I-ADVP` (4).
- `ner_tags`: a `list` of classification labels, with possible values including `O` (0), `B-PER` (1), `I-PER` (2), `B-ORG` (3), `I-ORG` (4).
### Data Splits
| name  |train|validation|test|
|-------|----:|---------:|---:|
|conllpp|14041|      3250|3453|
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{wang2019crossweigh,
title={CrossWeigh: Training Named Entity Tagger from Imperfect Annotations},
author={Wang, Zihan and Shang, Jingbo and Liu, Liyuan and Lu, Lihao and Liu, Jiacheng and Han, Jiawei},
booktitle={Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)},
pages={5157--5166},
year={2019}
}
```
### Contributions
Thanks to [@ZihanWangKi](https://github.com/ZihanWangKi) for adding this dataset. |
irds/neumarco_ru_train_judged | ---
pretty_name: '`neumarco/ru/train/judged`'
viewer: false
source_datasets: ['irds/neumarco_ru', 'irds/neumarco_ru_train']
task_categories:
- text-retrieval
---
# Dataset Card for `neumarco/ru/train/judged`
The `neumarco/ru/train/judged` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/neumarco#neumarco/ru/train/judged).
# Data
This dataset provides:
- `queries` (i.e., topics); count=502,939
- For `docs`, use [`irds/neumarco_ru`](https://huggingface.co/datasets/irds/neumarco_ru)
- For `qrels`, use [`irds/neumarco_ru_train`](https://huggingface.co/datasets/irds/neumarco_ru_train)
- For `docpairs`, use [`irds/neumarco_ru_train`](https://huggingface.co/datasets/irds/neumarco_ru_train)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/neumarco_ru_train_judged', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
selimyagci/expert_annoatated_english | ---
license: unknown
---
|
kiddothe2b/contract-nli | ---
license: cc-by-nc-sa-4.0
---
|
valurank/12-factor | ---
license:
- other
language:
- en
multilinguality:
- monolingual
task_categories:
- classification
task_ids:
- classification
---
# Dataset Card for 12-factor
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Source Data](#source-data)
## Dataset Description
100+ news article URLs, each scored on 12 different factors and assigned a single score.
## Languages
The text in the dataset is in English.
## Source Data
The dataset is manually scraped and annotated by Alex
|
davidfant/natural-questions-chunk-24 | ---
dataset_info:
features:
- name: id
dtype: string
- name: document
struct:
- name: html
dtype: string
- name: title
dtype: string
- name: tokens
sequence:
- name: end_byte
dtype: int64
- name: is_html
dtype: bool
- name: start_byte
dtype: int64
- name: token
dtype: string
- name: url
dtype: string
- name: question
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: long_answer_candidates
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: top_level
dtype: bool
- name: annotations
sequence:
- name: id
dtype: string
- name: long_answer
struct:
- name: candidate_index
dtype: int64
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: short_answers
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: text
dtype: string
- name: yes_no_answer
dtype:
class_label:
names:
'0': 'NO'
'1': 'YES'
splits:
- name: train
num_bytes: 4627990690
num_examples: 10000
download_size: 1797158593
dataset_size: 4627990690
---
# Dataset Card for "natural-questions-chunk-24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edwinjue/311-data-2021 | ---
license: gpl-3.0
---
|
benayas/atis_chatgpt_10pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 434484
num_examples: 4455
download_size: 147384
dataset_size: 434484
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tracygu/dnscustom | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_ibivibiv__strix-rufipes-70b | ---
pretty_name: Evaluation run of ibivibiv/strix-rufipes-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ibivibiv/strix-rufipes-70b](https://huggingface.co/ibivibiv/strix-rufipes-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibivibiv__strix-rufipes-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T03:56:38.126146](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__strix-rufipes-70b/blob/main/results_2024-01-23T03-56-38.126146.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6910894247381432,\n\
\ \"acc_stderr\": 0.03074765419800289,\n \"acc_norm\": 0.6948907257668793,\n\
\ \"acc_norm_stderr\": 0.03135484817423871,\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5672072620719157,\n\
\ \"mc2_stderr\": 0.014748654287331589\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.674061433447099,\n \"acc_stderr\": 0.013697432466693246,\n\
\ \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274777\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6898028281218881,\n\
\ \"acc_stderr\": 0.004616288245259753,\n \"acc_norm\": 0.8786098386775543,\n\
\ \"acc_norm_stderr\": 0.003259127057668171\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742399,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.032790004063100515,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.032790004063100515\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.03036358219723817,\n\
\ \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.03036358219723817\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778387,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778387\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528437,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528437\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942088,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942088\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.0228158130988966,\n \
\ \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.0228158130988966\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289715,\n \"\
acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289715\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8788990825688073,\n \"acc_stderr\": 0.013987618292389713,\n \"\
acc_norm\": 0.8788990825688073,\n \"acc_norm_stderr\": 0.013987618292389713\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9019607843137255,\n\
\ \"acc_stderr\": 0.0208711184555521,\n \"acc_norm\": 0.9019607843137255,\n\
\ \"acc_norm_stderr\": 0.0208711184555521\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n\
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.02779017706438359,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.02779017706438359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476075,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476075\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044578,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044578\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867464,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867464\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8569604086845466,\n\
\ \"acc_stderr\": 0.012520023176796534,\n \"acc_norm\": 0.8569604086845466,\n\
\ \"acc_norm_stderr\": 0.012520023176796534\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.02269865716785571,\n\
\ \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.02269865716785571\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5184357541899441,\n\
\ \"acc_stderr\": 0.01671113049778282,\n \"acc_norm\": 0.5184357541899441,\n\
\ \"acc_norm_stderr\": 0.01671113049778282\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157375,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157375\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5743155149934811,\n\
\ \"acc_stderr\": 0.01262839355181194,\n \"acc_norm\": 0.5743155149934811,\n\
\ \"acc_norm_stderr\": 0.01262839355181194\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.01736247376214662,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.01736247376214662\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827054,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5672072620719157,\n\
\ \"mc2_stderr\": 0.014748654287331589\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065614\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5382865807429871,\n \
\ \"acc_stderr\": 0.01373204822701668\n }\n}\n```"
repo_url: https://huggingface.co/ibivibiv/strix-rufipes-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|arc:challenge|25_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|gsm8k|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hellaswag|10_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T03-56-38.126146.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T03-56-38.126146.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- '**/details_harness|winogrande|5_2024-01-23T03-56-38.126146.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T03-56-38.126146.parquet'
- config_name: results
data_files:
- split: 2024_01_23T03_56_38.126146
path:
- results_2024-01-23T03-56-38.126146.parquet
- split: latest
path:
- results_2024-01-23T03-56-38.126146.parquet
---
# Dataset Card for Evaluation run of ibivibiv/strix-rufipes-70b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ibivibiv/strix-rufipes-70b](https://huggingface.co/ibivibiv/strix-rufipes-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ibivibiv__strix-rufipes-70b",
"harness_winogrande_5",
	split="latest")
```
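Each record in the "results" configuration carries the per-task metric dictionaries shown further down this card. As a minimal sketch of reading the aggregate block once loaded (the field names match this card's results JSON; the record below is a hard-coded excerpt rather than a live download):

```python
# Sketch: extracting aggregate metrics from a results record.
# The keys ("all", "acc", "acc_norm", "mc1", "mc2") match the results
# JSON in this card; the values are copied from the latest run.
record = {
    "all": {
        "acc": 0.6910894247381432,
        "acc_norm": 0.6948907257668793,
        "mc1": 0.40024479804161567,
        "mc2": 0.5672072620719157,
    }
}

# Round each aggregate metric for a compact summary.
aggregate = record["all"]
summary = {name: round(value, 4) for name, value in aggregate.items()}
print(summary)  # {'acc': 0.6911, 'acc_norm': 0.6949, 'mc1': 0.4002, 'mc2': 0.5672}
```

In an actual session, `record` would come from iterating over `load_dataset(..., "results", split="latest")`.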
## Latest results
These are the [latest results from run 2024-01-23T03:56:38.126146](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__strix-rufipes-70b/blob/main/results_2024-01-23T03-56-38.126146.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its configuration's timestamped and "latest" splits):
```python
{
"all": {
"acc": 0.6910894247381432,
"acc_stderr": 0.03074765419800289,
"acc_norm": 0.6948907257668793,
"acc_norm_stderr": 0.03135484817423871,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5672072620719157,
"mc2_stderr": 0.014748654287331589
},
"harness|arc:challenge|25": {
"acc": 0.674061433447099,
"acc_stderr": 0.013697432466693246,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274777
},
"harness|hellaswag|10": {
"acc": 0.6898028281218881,
"acc_stderr": 0.004616288245259753,
"acc_norm": 0.8786098386775543,
"acc_norm_stderr": 0.003259127057668171
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742399,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.032790004063100515,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.032790004063100515
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6851063829787234,
"acc_stderr": 0.03036358219723817,
"acc_norm": 0.6851063829787234,
"acc_norm_stderr": 0.03036358219723817
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778387,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528437,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528437
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942088,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942088
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.0228158130988966,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.0228158130988966
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289715,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289715
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8788990825688073,
"acc_stderr": 0.013987618292389713,
"acc_norm": 0.8788990825688073,
"acc_norm_stderr": 0.013987618292389713
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.0208711184555521,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.0208711184555521
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.02779017706438359,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.02779017706438359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476075,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476075
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044578,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044578
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867464,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867464
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8569604086845466,
"acc_stderr": 0.012520023176796534,
"acc_norm": 0.8569604086845466,
"acc_norm_stderr": 0.012520023176796534
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.02269865716785571,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.02269865716785571
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5184357541899441,
"acc_stderr": 0.01671113049778282,
"acc_norm": 0.5184357541899441,
"acc_norm_stderr": 0.01671113049778282
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157375,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5390070921985816,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.5390070921985816,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5743155149934811,
"acc_stderr": 0.01262839355181194,
"acc_norm": 0.5743155149934811,
"acc_norm_stderr": 0.01262839355181194
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.01736247376214662,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.01736247376214662
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827054,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5672072620719157,
"mc2_stderr": 0.014748654287331589
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065614
},
"harness|gsm8k|5": {
"acc": 0.5382865807429871,
"acc_stderr": 0.01373204822701668
}
}
```
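Every per-task entry in the results block above shares the same `{acc, acc_stderr, acc_norm, acc_norm_stderr}` shape, so aggregate scores can be recomputed with a few lines of standard-library Python. A minimal sketch on a handful of entries copied from the block (the task names and values come from above; everything else is illustrative):

```python
import json

# A few per-task entries in the same shape as the results block above
raw = """
{
  "harness|hendrycksTest-management|5": {"acc": 0.8543689320388349},
  "harness|hendrycksTest-marketing|5": {"acc": 0.8974358974358975},
  "harness|gsm8k|5": {"acc": 0.5382865807429871}
}
"""
results = json.loads(raw)

# Average accuracy over the MMLU (hendrycksTest) subtasks only
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest")]
print(round(sum(mmlu) / len(mmlu), 4))
```

The same filter-by-prefix pattern works for any of the harness task families in the block.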
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Falah/national_geographic_photography_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 145497
num_examples: 1000
download_size: 5368
dataset_size: 145497
---
# Dataset Card for "national_geographic_photography_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
m-a-p/SciMMIR | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: file_name_index
dtype: string
- name: text
dtype: string
- name: class
dtype: string
- name: super_class
dtype: string
- name: sub_class
dtype: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 59242453844.635
num_examples: 498279
- name: validation
num_bytes: 1783636593.843
num_examples: 16433
- name: test
num_bytes: 1874022111.346
num_examples: 16263
download_size: 63729889852
dataset_size: 62900112549.824005
---
# Dataset Card for "SciMMIR_dataset"
## SciMMIR
This is the repo for the paper [SciMMIR: Benchmarking Scientific Multi-modal Information Retrieval](https://arxiv.org/abs/2401.13478).

In this paper, we propose a novel SciMMIR benchmark and a corresponding dataset designed to address the gap in evaluating multi-modal information retrieval (MMIR) models in the scientific domain.
It is worth mentioning that we define a hierarchical data architecture of "Two subsets, Five subcategories" and use human-created keywords to classify the data (as shown in the table below).

As shown in the table below, we conducted extensive baseline experiments (both fine-tuned and zero-shot) across the various subsets and subcategories.

For more detailed experimental results and analysis, please refer to our paper [SciMMIR](https://arxiv.org/abs/2401.13478).
## Dataset
Our SciMMIR benchmark dataset used in this paper contains 537K scientific image-text pairs extracted from arXiv papers from the most recent six months at collection time (2023.05 to 2023.10). We will continue to expand the data by extracting from more arXiv papers and will provide larger versions of the dataset.
The dataset can be obtained from Hugging Face Datasets at [m-a-p/SciMMIR](https://huggingface.co/datasets/m-a-p/SciMMIR), and the following code shows how to use it:
```python
import datasets

# Load the full benchmark (train / validation / test splits)
ds_remote = datasets.load_dataset("m-a-p/SciMMIR")
test_data = ds_remote['test']

caption = test_data[0]['text']       # paired caption text
image_type = test_data[0]['class']   # subcategory label
image = test_data[0]['image']        # PIL image of the figure/table
```
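Since every example also carries `class`, `super_class`, and `sub_class` fields, slicing the benchmark along the "Two subsets, Five subcategories" hierarchy is a plain filter over those columns. A sketch of the pattern on a toy in-memory list (the label strings are illustrative stand-ins, not the dataset's actual values):

```python
from collections import Counter

# Toy rows carrying the same label fields as SciMMIR examples;
# the label strings here are illustrative, not the dataset's actual values.
rows = [
    {"text": "loss curve over training epochs", "super_class": "figure", "sub_class": "result"},
    {"text": "model architecture diagram", "super_class": "figure", "sub_class": "architecture"},
    {"text": "main results on the benchmark", "super_class": "table", "sub_class": "result"},
]

# Slice by subset, then count the (subset, subcategory) histogram
figures = [r for r in rows if r["super_class"] == "figure"]
hist = Counter((r["super_class"], r["sub_class"]) for r in rows)
print(len(figures), hist[("figure", "result")])
```

The same filter translates directly to `datasets.Dataset.filter` on the loaded benchmark.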
## Code
The code for this paper can be found in our [GitHub](https://github.com/Wusiwei0410/SciMMIR) repository.
## Potential TODOs before ACL
**TODO**: case study table
**TODO**: statistics of the paper fields (perhaps in appendix)
**TODO**: See if it's possible to further divide the "Figure Results" subsets.
## Citation
```
@misc{wu2024scimmir,
title={SciMMIR: Benchmarking Scientific Multi-modal Information Retrieval},
author={Siwei Wu and Yizhi Li and Kang Zhu and Ge Zhang and Yiming Liang and Kaijing Ma and Chenghao Xiao and Haoran Zhang and Bohao Yang and Wenhu Chen and Wenhao Huang and Noura Al Moubayed and Jie Fu and Chenghua Lin},
year={2024},
eprint={2401.13478},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
```
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/takamine_noa_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of takamine_noa/高峯のあ (THE iDOLM@STER: Cinderella Girls)
This is the dataset of takamine_noa/高峯のあ (THE iDOLM@STER: Cinderella Girls), containing 102 images and their tags.
The core tags of this character are `long_hair, breasts, grey_hair, brown_eyes, large_breasts, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 102 | 85.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takamine_noa_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 102 | 61.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takamine_noa_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 204 | 112.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takamine_noa_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 102 | 77.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takamine_noa_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 204 | 139.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/takamine_noa_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
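The `IMG+TXT` packages pair each image with a same-named `.txt` file holding its comma-separated tags (an assumption from the package type, matching the common training-data convention; verify against the extracted archive). The pairing logic can be sketched offline on dummy files:

```python
import tempfile
from pathlib import Path

# Dummy stand-ins for an extracted IMG+TXT package:
# each image has a sibling .txt file with its comma-separated tags.
root = Path(tempfile.mkdtemp())
(root / "000001.png").write_bytes(b"")
(root / "000001.txt").write_text("1girl, solo, long_hair")
(root / "000002.png").write_bytes(b"")
(root / "000002.txt").write_text("1girl, smile")

# Pair each image with the tag list from its same-named .txt file
pairs = {
    img.name: img.with_suffix(".txt").read_text().split(", ")
    for img in sorted(root.glob("*.png"))
}
print(pairs["000001.png"])
```

The real archives would first be fetched with `hf_hub_download` and extracted, as in the raw-dataset example below.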
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/takamine_noa_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, card_(medium), character_name, gem_(symbol), solo, star_(symbol), jewelry, belt, detached_sleeves, dress, looking_at_viewer, thighhighs |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, solo, bridal_gauntlets, red_eyes, smile, dress, fingerless_gloves, jewelry, medium_breasts |
| 2 | 20 |  |  |  |  |  | 1girl, solo, looking_at_viewer, cleavage, playboy_bunny, rabbit_ears, bangs, detached_collar, wrist_cuffs, bare_shoulders, bowtie, fake_animal_ears, hair_between_eyes, jewelry, white_background, simple_background, closed_mouth, fishnet_pantyhose, hairband, black_leotard, smile, yellow_bow |
| 3 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, simple_background, white_background, cowboy_shot, medium_breasts, closed_mouth, jewelry, one-piece_swimsuit, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | card_(medium) | character_name | gem_(symbol) | solo | star_(symbol) | jewelry | belt | detached_sleeves | dress | looking_at_viewer | thighhighs | bare_shoulders | bridal_gauntlets | red_eyes | smile | fingerless_gloves | medium_breasts | cleavage | playboy_bunny | rabbit_ears | bangs | detached_collar | wrist_cuffs | bowtie | fake_animal_ears | hair_between_eyes | white_background | simple_background | closed_mouth | fishnet_pantyhose | hairband | black_leotard | yellow_bow | cowboy_shot | one-piece_swimsuit |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:-----------------|:---------------|:-------|:----------------|:----------|:-------|:-------------------|:--------|:--------------------|:-------------|:-----------------|:-------------------|:-----------|:--------|:--------------------|:-----------------|:-----------|:----------------|:--------------|:--------|:------------------|:--------------|:---------|:-------------------|:--------------------|:-------------------|:--------------------|:---------------|:--------------------|:-----------|:----------------|:-------------|:--------------|:---------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | | X | | X | | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 20 |  |  |  |  |  | X | | | | X | | X | | | | X | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | |
| 3 | 12 |  |  |  |  |  | X | | | | X | | X | | | | X | | | | | X | | X | | | | | | | | | | X | X | X | | | | | X | X |
|
Pablao0948/Joe_Goldberg | ---
license: openrail
---
|
CyberHarem/shiraki_hime_watashinoyuriwaoshigotodesu | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Shiraki Hime
This is the dataset of Shiraki Hime, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 655 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 655 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 655 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 655 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-2-3 | ---
pretty_name: Evaluation run of juhwanlee/gemma-7B-alpaca-case-2-3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [juhwanlee/gemma-7B-alpaca-case-2-3](https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-2-3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-2-3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T18:29:11.869374](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-2-3/blob/main/results_2024-03-27T18-29-11.869374.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26908811376582065,\n\
\ \"acc_stderr\": 0.031454313335312636,\n \"acc_norm\": 0.2701685926335597,\n\
\ \"acc_norm_stderr\": 0.032293322700319135,\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427678,\n \"mc2\": 0.4798747402340299,\n\
\ \"mc2_stderr\": 0.0170283180875269\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22013651877133106,\n \"acc_stderr\": 0.01210812488346098,\n\
\ \"acc_norm\": 0.25597269624573377,\n \"acc_norm_stderr\": 0.012753013241244518\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2570205138418642,\n\
\ \"acc_stderr\": 0.004360977256058731,\n \"acc_norm\": 0.2566221868153754,\n\
\ \"acc_norm_stderr\": 0.004358764596401032\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.026148818018424513,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.026148818018424513\n \
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n\
\ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856113,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856113\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970187,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970187\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3316062176165803,\n \"acc_stderr\": 0.03397636541089116,\n\
\ \"acc_norm\": 0.3316062176165803,\n \"acc_norm_stderr\": 0.03397636541089116\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32051282051282054,\n \"acc_stderr\": 0.02366129639396428,\n\
\ \"acc_norm\": 0.32051282051282054,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609542,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609542\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886835,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886835\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3155963302752294,\n \"acc_stderr\": 0.019926117513869666,\n \"\
acc_norm\": 0.3155963302752294,\n \"acc_norm_stderr\": 0.019926117513869666\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993656,\n \"\
acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993656\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.14349775784753363,\n\
\ \"acc_stderr\": 0.02352937126961818,\n \"acc_norm\": 0.14349775784753363,\n\
\ \"acc_norm_stderr\": 0.02352937126961818\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.04414343666854932,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.04414343666854932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20434227330779056,\n\
\ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.20434227330779056,\n\
\ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2976878612716763,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.2976878612716763,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n\
\ \"acc_stderr\": 0.02608270069539965,\n \"acc_norm\": 0.3022508038585209,\n\
\ \"acc_norm_stderr\": 0.02608270069539965\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543343,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543343\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140228,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140228\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25945241199478486,\n\
\ \"acc_stderr\": 0.01119526207635031,\n \"acc_norm\": 0.25945241199478486,\n\
\ \"acc_norm_stderr\": 0.01119526207635031\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.27205882352941174,\n \"acc_stderr\": 0.027033041151681456,\n\
\ \"acc_norm\": 0.27205882352941174,\n \"acc_norm_stderr\": 0.027033041151681456\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.272875816993464,\n \"acc_stderr\": 0.018020474148393577,\n \
\ \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.018020474148393577\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.027529637440174923,\n\
\ \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.027529637440174923\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1746987951807229,\n\
\ \"acc_stderr\": 0.02956032621125684,\n \"acc_norm\": 0.1746987951807229,\n\
\ \"acc_norm_stderr\": 0.02956032621125684\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.0356507967070831,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.0356507967070831\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427678,\n \"mc2\": 0.4798747402340299,\n\
\ \"mc2_stderr\": 0.0170283180875269\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5098658247829518,\n \"acc_stderr\": 0.014049749833367592\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-2-3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-29-11.869374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T18-29-11.869374.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- '**/details_harness|winogrande|5_2024-03-27T18-29-11.869374.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T18-29-11.869374.parquet'
- config_name: results
data_files:
- split: 2024_03_27T18_29_11.869374
path:
- results_2024-03-27T18-29-11.869374.parquet
- split: latest
path:
- results_2024-03-27T18-29-11.869374.parquet
---
# Dataset Card for Evaluation run of juhwanlee/gemma-7B-alpaca-case-2-3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [juhwanlee/gemma-7B-alpaca-case-2-3](https://huggingface.co/juhwanlee/gemma-7B-alpaca-case-2-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-2-3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-27T18:29:11.869374](https://huggingface.co/datasets/open-llm-leaderboard/details_juhwanlee__gemma-7B-alpaca-case-2-3/blob/main/results_2024-03-27T18-29-11.869374.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.26908811376582065,
"acc_stderr": 0.031454313335312636,
"acc_norm": 0.2701685926335597,
"acc_norm_stderr": 0.032293322700319135,
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427678,
"mc2": 0.4798747402340299,
"mc2_stderr": 0.0170283180875269
},
"harness|arc:challenge|25": {
"acc": 0.22013651877133106,
"acc_stderr": 0.01210812488346098,
"acc_norm": 0.25597269624573377,
"acc_norm_stderr": 0.012753013241244518
},
"harness|hellaswag|10": {
"acc": 0.2570205138418642,
"acc_stderr": 0.004360977256058731,
"acc_norm": 0.2566221868153754,
"acc_norm_stderr": 0.004358764596401032
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2,
"acc_stderr": 0.026148818018424513,
"acc_norm": 0.2,
"acc_norm_stderr": 0.026148818018424513
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856113,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856113
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970187,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970187
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3316062176165803,
"acc_stderr": 0.03397636541089116,
"acc_norm": 0.3316062176165803,
"acc_norm_stderr": 0.03397636541089116
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609542,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609542
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886835,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886835
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3155963302752294,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.3155963302752294,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993656,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.14349775784753363,
"acc_stderr": 0.02352937126961818,
"acc_norm": 0.14349775784753363,
"acc_norm_stderr": 0.02352937126961818
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854932,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20434227330779056,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.20434227330779056,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2976878612716763,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.2976878612716763,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3022508038585209,
"acc_stderr": 0.02608270069539965,
"acc_norm": 0.3022508038585209,
"acc_norm_stderr": 0.02608270069539965
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543343,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543343
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140228,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140228
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25945241199478486,
"acc_stderr": 0.01119526207635031,
"acc_norm": 0.25945241199478486,
"acc_norm_stderr": 0.01119526207635031
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27205882352941174,
"acc_stderr": 0.027033041151681456,
"acc_norm": 0.27205882352941174,
"acc_norm_stderr": 0.027033041151681456
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.018020474148393577,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.018020474148393577
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1746987951807229,
"acc_stderr": 0.02956032621125684,
"acc_norm": 0.1746987951807229,
"acc_norm_stderr": 0.02956032621125684
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.0356507967070831,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.0356507967070831
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427678,
"mc2": 0.4798747402340299,
"mc2_stderr": 0.0170283180875269
},
"harness|winogrande|5": {
"acc": 0.5098658247829518,
"acc_stderr": 0.014049749833367592
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
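As a sketch of how the per-task scores relate to the aggregate, the 5-shot MMLU sub-task accuracies above can be averaged into a single score. The dict below copies a small illustrative subset of the values from the results JSON; it is not the leaderboard's official aggregation.

```python
# Average a few 5-shot MMLU sub-task accuracies (values copied from the
# results JSON above; illustrative subset only, not the full task list).
task_acc = {
    "abstract_algebra": 0.3,
    "anatomy": 0.34814814814814815,
    "astronomy": 0.34210526315789475,
    "business_ethics": 0.23,
}

# Unweighted mean over the selected sub-tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(round(mean_acc, 4))  # → 0.3051
```

The full leaderboard aggregate averages over all 57 MMLU sub-tasks, so the number above differs from the "all" entry in the JSON.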
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]