datasetId | card |
|---|---|
ssbuild/alpaca_guanaco | ---
license: apache-2.0
---
|
arbml/Author_Attribution_Tweets | ---
dataset_info:
features:
- name: tweet
dtype: string
- name: author
dtype: string
splits:
- name: test
num_bytes: 2629687
num_examples: 13341
- name: train
num_bytes: 10441650
num_examples: 53198
download_size: 6482998
dataset_size: 13071337
---
# Dataset Card for "Author_Attribution_Tweets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ouvic215/Soldering-Data-pix2pix-1209-white-1 | ---
dataset_info:
features:
- name: mask_image
dtype: image
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 511555221.25
num_examples: 6799
download_size: 510366317
dataset_size: 511555221.25
---
# Dataset Card for "Soldering-Data-pix2pix-1209-white-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AppleHarem/paprika_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of paprika (Arknights)
This is the dataset of paprika (Arknights), containing 13 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
A WebUI containing the crawlers and other tools is also available: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 13 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 33 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 40 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 13 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 13 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 13 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 33 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 33 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 24 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 40 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 40 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
OptimOS/Selfies | ---
license: unknown
---
|
HuggingFaceM4/imagenet1k_support_5k_query_sets |  |
jamestalentium/xsum_10_finetune | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 23485.327403268886
num_examples: 10
download_size: 19146
dataset_size: 23485.327403268886
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xsum_10_finetune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atmallen/quirky_sciq_pythia-410m | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: choices
sequence: string
- name: label
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: bob_log_odds
dtype: float64
splits:
- name: train
num_bytes: 29103976
num_examples: 46716
- name: validation
num_bytes: 2464470
num_examples: 4000
- name: test
num_bytes: 2510666
num_examples: 4000
download_size: 7307630
dataset_size: 34079112
---
# Dataset Card for "quirky_sciq_pythia-410m"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ColinCcz/processed_mental_data | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 33504534
num_examples: 34823
- name: validation
num_bytes: 8336083
num_examples: 8706
- name: test
num_bytes: 10326719
num_examples: 10883
download_size: 31547813
dataset_size: 52167336
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Yus287/y-github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: string
- name: updated_at
dtype: string
- name: closed_at
dtype: string
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 19943411.093366094
num_examples: 3907
- name: test
num_bytes: 1000488.5012285012
num_examples: 196
- name: val
num_bytes: 3986640.4054054054
num_examples: 781
download_size: 7953657
dataset_size: 24930540.0
---
# Dataset Card for "y-github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nevo399/orangecocoa | ---
license: openrail
---
|
FreedomIntelligence/MMLU_Italian | ---
license: mit
---
Italian version of the MMLU dataset, translated by gpt-3.5-turbo.
The dataset is used in research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
ludekcizinsky/epfl-cs502-hw3 | ---
license: mit
---
|
andersonbcdefg/captions_triples_unfiltered_bm25 | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 40294446
num_examples: 229311
download_size: 21806078
dataset_size: 40294446
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora | ---
pretty_name: Evaluation run of JunchengXie/Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JunchengXie/Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora](https://huggingface.co/JunchengXie/Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T23:59:10.991747](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora/blob/main/results_2024-03-27T23-59-10.991747.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.582160493650072,\n\
\ \"acc_stderr\": 0.03365366470977369,\n \"acc_norm\": 0.5887215545601854,\n\
\ \"acc_norm_stderr\": 0.03434852603389613,\n \"mc1\": 0.5177478580171359,\n\
\ \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6831698375539644,\n\
\ \"mc2_stderr\": 0.015593330487456654\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n\
\ \"acc_norm\": 0.5947098976109215,\n \"acc_norm_stderr\": 0.014346869060229321\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6102370045807608,\n\
\ \"acc_stderr\": 0.004866997110388195,\n \"acc_norm\": 0.7969527982473611,\n\
\ \"acc_norm_stderr\": 0.0040144524737232646\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n\
\ \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726368,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726368\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6064516129032258,\n \"acc_stderr\": 0.027791878753132274,\n \"\
acc_norm\": 0.6064516129032258,\n \"acc_norm_stderr\": 0.027791878753132274\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240648,\n\
\ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240648\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.018368176306598618,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.018368176306598618\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289202,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289202\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748929,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748929\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101081,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101081\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n\
\ \"acc_stderr\": 0.015201032512520439,\n \"acc_norm\": 0.2916201117318436,\n\
\ \"acc_norm_stderr\": 0.015201032512520439\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236855,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n\
\ \"acc_stderr\": 0.012573836633799015,\n \"acc_norm\": 0.41264667535853977,\n\
\ \"acc_norm_stderr\": 0.012573836633799015\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.02962466358115969,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.02962466358115969\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.619281045751634,\n \"acc_stderr\": 0.019643801557924803,\n \
\ \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.019643801557924803\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.0301164262965406,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.0301164262965406\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.034104105654953025,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.034104105654953025\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5177478580171359,\n\
\ \"mc1_stderr\": 0.017492470843075356,\n \"mc2\": 0.6831698375539644,\n\
\ \"mc2_stderr\": 0.015593330487456654\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7032359905288083,\n \"acc_stderr\": 0.012839239695202025\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.288855193328279,\n \
\ \"acc_stderr\": 0.012484219800126673\n }\n}\n```"
repo_url: https://huggingface.co/JunchengXie/Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|arc:challenge|25_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|gsm8k|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hellaswag|10_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-59-10.991747.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T23-59-10.991747.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- '**/details_harness|winogrande|5_2024-03-27T23-59-10.991747.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T23-59-10.991747.parquet'
- config_name: results
data_files:
- split: 2024_03_27T23_59_10.991747
path:
- results_2024-03-27T23-59-10.991747.parquet
- split: latest
path:
- results_2024-03-27T23-59-10.991747.parquet
---
# Dataset Card for Evaluation run of JunchengXie/Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JunchengXie/Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora](https://huggingface.co/JunchengXie/Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora",
"harness_winogrande_5",
	split="latest")
```
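The aggregated metrics (see the "Latest results" JSON below) are nested per task. As a minimal sketch, a hypothetical helper `flatten_results` could reshape that `{task: {metric: value}}` mapping into flat rows for tabular analysis; the sample data here is illustrative, not taken from the actual run:

```python
def flatten_results(results: dict) -> list[tuple[str, str, float]]:
    """Flatten {task: {metric: value}} into (task, metric, value) rows."""
    rows = []
    for task, metrics in results.items():
        for metric, value in metrics.items():
            rows.append((task, metric, value))
    return rows


# Illustrative sample mirroring the structure of the results JSON
sample = {
    "all": {"acc": 0.582, "acc_norm": 0.589},
    "harness|winogrande|5": {"acc": 0.74},
}
rows = flatten_results(sample)
print(rows)
# → [('all', 'acc', 0.582), ('all', 'acc_norm', 0.589), ('harness|winogrande|5', 'acc', 0.74)]
```

The same flattening can then be fed into `pandas.DataFrame(rows, columns=["task", "metric", "value"])` for filtering or plotting.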
## Latest results
These are the [latest results from run 2024-03-27T23:59:10.991747](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-Instruct-v0.2-gpt-4-80k-base_lora/blob/main/results_2024-03-27T23-59-10.991747.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.582160493650072,
"acc_stderr": 0.03365366470977369,
"acc_norm": 0.5887215545601854,
"acc_norm_stderr": 0.03434852603389613,
"mc1": 0.5177478580171359,
"mc1_stderr": 0.017492470843075356,
"mc2": 0.6831698375539644,
"mc2_stderr": 0.015593330487456654
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.5947098976109215,
"acc_norm_stderr": 0.014346869060229321
},
"harness|hellaswag|10": {
"acc": 0.6102370045807608,
"acc_stderr": 0.004866997110388195,
"acc_norm": 0.7969527982473611,
"acc_norm_stderr": 0.0040144524737232646
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726368,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726368
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.027791878753132274,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.027791878753132274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240648,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240648
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.018368176306598618,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.018368176306598618
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289202,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289202
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748929,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748929
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101081,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101081
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520439,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520439
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200868,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200868
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236855,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41264667535853977,
"acc_stderr": 0.012573836633799015,
"acc_norm": 0.41264667535853977,
"acc_norm_stderr": 0.012573836633799015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.02962466358115969,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.02962466358115969
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.019643801557924803,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.019643801557924803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.0301164262965406,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.0301164262965406
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.034104105654953025,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.034104105654953025
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5177478580171359,
"mc1_stderr": 0.017492470843075356,
"mc2": 0.6831698375539644,
"mc2_stderr": 0.015593330487456654
},
"harness|winogrande|5": {
"acc": 0.7032359905288083,
"acc_stderr": 0.012839239695202025
},
"harness|gsm8k|5": {
"acc": 0.288855193328279,
"acc_stderr": 0.012484219800126673
}
}
```
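The `acc_stderr` values in the JSON above appear to be the usual standard error of a proportion. As a sketch, the figure for `high_school_geography` can be reproduced under two assumptions that are not stated in the card itself: the MMLU high_school_geography test split has n = 198 questions, and the harness uses the sample (n − 1) denominator.

```python
import math

# Sketch: reproduce acc_stderr for high_school_geography from the JSON above.
# Assumptions (not confirmed by this card): n = 198 questions in the split,
# and the harness reports the sample standard error sqrt(p * (1 - p) / (n - 1)).
n = 198
acc = 144 / n  # 0.7272727..., the reported accuracy
stderr = math.sqrt(acc * (1 - acc) / (n - 1))
print(stderr)  # ~0.03173071239, matching the reported acc_stderr
```

If the divisor were n rather than n − 1, the result would come out near 0.03165 instead, so the reported value is consistent with the sample-variance convention.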
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
quocanh34/HNAG_new_cut_final | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: w2v2_transcription
dtype: string
- name: WER
dtype: int64
splits:
- name: train
num_bytes: 17336493.0
num_examples: 221
download_size: 17334343
dataset_size: 17336493.0
---
# Dataset Card for "HNAG_new_cut_final"
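Each example carries a reference `transcription`, a wav2vec 2.0 hypothesis (`w2v2_transcription`), and a `WER` column. The card does not document how that column was computed (and it is typed `int64`, so it may hold a rounded or percentage value); below is a minimal sketch of a standard word error rate as Levenshtein distance over words, normalized by reference length — an assumption, not the card's confirmed recipe.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by the reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)
```

For example, a single substituted word in a three-word reference yields a rate of 1/3.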
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ehartford__Wizard-Vicuna-30B-Uncensored | ---
pretty_name: Evaluation run of ehartford/Wizard-Vicuna-30B-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Wizard-Vicuna-30B-Uncensored](https://huggingface.co/ehartford/Wizard-Vicuna-30B-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Wizard-Vicuna-30B-Uncensored\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T12:57:01.368480](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Wizard-Vicuna-30B-Uncensored/blob/main/results_2023-10-18T12-57-01.368480.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.18162751677852348,\n\
\ \"em_stderr\": 0.0039482621737543045,\n \"f1\": 0.2674087667785243,\n\
\ \"f1_stderr\": 0.004012090110572664,\n \"acc\": 0.46353130406008236,\n\
\ \"acc_stderr\": 0.01059244186586655\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.18162751677852348,\n \"em_stderr\": 0.0039482621737543045,\n\
\ \"f1\": 0.2674087667785243,\n \"f1_stderr\": 0.004012090110572664\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1425322213798332,\n \
\ \"acc_stderr\": 0.009629588445673819\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059279\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/Wizard-Vicuna-30B-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T12_57_01.368480
path:
- '**/details_harness|drop|3_2023-10-18T12-57-01.368480.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T12-57-01.368480.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T12_57_01.368480
path:
- '**/details_harness|gsm8k|5_2023-10-18T12-57-01.368480.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T12-57-01.368480.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:31:27.283689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:31:27.283689.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:31:27.283689.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T12_57_01.368480
path:
- '**/details_harness|winogrande|5_2023-10-18T12-57-01.368480.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T12-57-01.368480.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_31_27.283689
path:
- results_2023-07-19T22:31:27.283689.parquet
- split: 2023_10_18T12_57_01.368480
path:
- results_2023-10-18T12-57-01.368480.parquet
- split: latest
path:
- results_2023-10-18T12-57-01.368480.parquet
---
# Dataset Card for Evaluation run of ehartford/Wizard-Vicuna-30B-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Wizard-Vicuna-30B-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Wizard-Vicuna-30B-Uncensored](https://huggingface.co/ehartford/Wizard-Vicuna-30B-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Wizard-Vicuna-30B-Uncensored",
"harness_winogrande_5",
split="train")
```
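The timestamped split names follow the pattern visible in the configuration above. As a minimal offline sketch (the split-name format is inferred from the splits listed in this card), such a name can be parsed back into a timestamp to determine which run the "latest" split points to:

```python
from datetime import datetime

def parse_split_name(name: str) -> datetime:
    """Parse a timestamped split name like '2023_10_18T12_57_01.368480'."""
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

# The timestamped splits present in this dataset card.
splits = ["2023_07_19T22_31_27.283689", "2023_10_18T12_57_01.368480"]

# The "latest" split mirrors the run with the greatest timestamp.
latest = max(splits, key=parse_split_name)
print(latest)  # → 2023_10_18T12_57_01.368480
```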
## Latest results
These are the [latest results from run 2023-10-18T12:57:01.368480](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Wizard-Vicuna-30B-Uncensored/blob/main/results_2023-10-18T12-57-01.368480.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.18162751677852348,
"em_stderr": 0.0039482621737543045,
"f1": 0.2674087667785243,
"f1_stderr": 0.004012090110572664,
"acc": 0.46353130406008236,
"acc_stderr": 0.01059244186586655
},
"harness|drop|3": {
"em": 0.18162751677852348,
"em_stderr": 0.0039482621737543045,
"f1": 0.2674087667785243,
"f1_stderr": 0.004012090110572664
},
"harness|gsm8k|5": {
"acc": 0.1425322213798332,
"acc_stderr": 0.009629588445673819
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059279
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
aureliojafer/twitter_dataset_1709851292 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
splits:
- name: train
num_bytes: 95669
num_examples: 315
download_size: 58873
dataset_size: 95669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_those_them | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 57212
num_examples: 218
- name: dev_mismatched
num_bytes: 78118
num_examples: 274
- name: test_matched
num_bytes: 57699
num_examples: 220
- name: test_mismatched
num_bytes: 59977
num_examples: 210
- name: train
num_bytes: 2021086
num_examples: 7730
download_size: 1301637
dataset_size: 2274092
---
# Dataset Card for "MULTI_VALUE_mnli_those_them"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sorenmulli/da-hashtag-twitterhjerne | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer 1
dtype: string
- name: Answer 2
dtype: string
- name: Answer 3
dtype: string
- name: Answer 4
dtype: string
- name: Answer 5
dtype: string
- name: Answer 6
dtype: string
- name: 'Unnamed: 8'
dtype: string
- name: 'Unnamed: 9'
dtype: string
splits:
- name: train
num_bytes: 51635
num_examples: 78
download_size: 50291
dataset_size: 51635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# [WIP] Dataset Card for "da-hashtag-twitterhjerne"
*Please note that this dataset and dataset card are both works in progress. For now, refer to the related [thesis](https://sorenmulli.github.io/thesis/thesis.pdf) for all details.*
|
richardr1126/spider-skeleton-context-instruct | ---
language:
- en
license:
- cc-by-4.0
source_datasets:
- spider
pretty_name: Spider Skeleton Context Instruct
tags:
- text-to-sql
- SQL
- Spider
- fine-tune
dataset_info:
features:
- name: db_id
dtype: string
- name: text
dtype: string
---
# Dataset Card for Spider Skeleton Context Instruct
### Dataset Summary
Spider is a large-scale, complex, and cross-domain semantic parsing and text-to-SQL dataset annotated by 11 Yale students.
The goal of the Spider challenge is to develop natural language interfaces to cross-domain databases.
This dataset was created to fine-tune LLMs in a `### Instruction:` and `### Response:` format with database context.
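As a rough sketch of what such a prompt might look like (the schema, question, and the `### Context:` marker below are invented for illustration; the actual rows store the full rendered prompt in the `text` field):

```python
# Hypothetical example: this schema and question are made up for illustration,
# and the exact section markers in the real dataset may differ.
context = "CREATE TABLE singer (singer_id INT, name TEXT, country TEXT)"
question = "How many singers are there?"
sql = "SELECT count(*) FROM singer"

prompt = (
    "### Instruction:\n"
    f"{question}\n\n"
    f"### Context:\n{context}\n\n"
    "### Response:\n"
    f"{sql}"
)
print(prompt)
```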
### Yale Lily Spider Leaderboards
The leaderboard can be seen at https://yale-lily.github.io/spider
### Languages
The text in the dataset is in English.
### Licensing Information
The Spider dataset is licensed under
the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/legalcode) license.
### Citation
```
@article{yu2018spider,
title={Spider: A large-scale human-labeled dataset for complex and cross-domain semantic parsing and text-to-sql task},
author={Yu, Tao and Zhang, Rui and Yang, Kai and Yasunaga, Michihiro and Wang, Dongxu and Li, Zifan and Ma, James and Li, Irene and Yao, Qingning and Roman, Shanelle and others},
journal={arXiv preprint arXiv:1809.08887},
year={2018}
}
``` |
samrm/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MITCriticalData/SAT1_dataset_5_top_cities | ---
license: mit
---
|
jonasantos5240/marvin | ---
license: openrail
---
|
xwjiang2010/pile_dedupe_val | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6062337711
num_examples: 1000000
download_size: 3343428302
dataset_size: 6062337711
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pile_dedupe_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1 | ---
pretty_name: Evaluation run of mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1](https://huggingface.co/mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T18:49:52.400292](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1/blob/main/results_2023-12-09T18-49-52.400292.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2882361931913223,\n\
\ \"acc_stderr\": 0.031895486998552665,\n \"acc_norm\": 0.2903928342274915,\n\
\ \"acc_norm_stderr\": 0.03267939625046512,\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006535,\n \"mc2\": 0.41227748774876055,\n\
\ \"mc2_stderr\": 0.014572961912704371\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3660409556313993,\n \"acc_stderr\": 0.01407722310847014,\n\
\ \"acc_norm\": 0.40187713310580203,\n \"acc_norm_stderr\": 0.014327268614578274\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5142401911969727,\n\
\ \"acc_stderr\": 0.00498775731476984,\n \"acc_norm\": 0.7007568213503286,\n\
\ \"acc_norm_stderr\": 0.00456990648509029\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.030251237579213174,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.030251237579213174\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.038783523721386215,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.038783523721386215\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.020940481565334835,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.020940481565334835\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n\
\ \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.24516129032258063,\n\
\ \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624335,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624335\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.29292929292929293,\n \"acc_stderr\": 0.03242497958178815,\n \"\
acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.03242497958178815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.031618779179354115,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.031618779179354115\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02144454730156047,\n\
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02144454730156047\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279496,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279496\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.034791855725996586,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.034791855725996586\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28990825688073396,\n \"acc_stderr\": 0.019453066609201597,\n \"\
acc_norm\": 0.28990825688073396,\n \"acc_norm_stderr\": 0.019453066609201597\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.31645569620253167,\n \"acc_stderr\": 0.03027497488021898,\n \
\ \"acc_norm\": 0.31645569620253167,\n \"acc_norm_stderr\": 0.03027497488021898\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.29596412556053814,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.32515337423312884,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.32515337423312884,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3162393162393162,\n\
\ \"acc_stderr\": 0.030463656747340268,\n \"acc_norm\": 0.3162393162393162,\n\
\ \"acc_norm_stderr\": 0.030463656747340268\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.37292464878671777,\n\
\ \"acc_stderr\": 0.017292868269453924,\n \"acc_norm\": 0.37292464878671777,\n\
\ \"acc_norm_stderr\": 0.017292868269453924\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.025305258131879716,\n\
\ \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.025305258131879716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210742,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210742\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.02600330111788513,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.02600330111788513\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3271604938271605,\n \"acc_stderr\": 0.026105673861409825,\n\
\ \"acc_norm\": 0.3271604938271605,\n \"acc_norm_stderr\": 0.026105673861409825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503793,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503793\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28096479791395046,\n\
\ \"acc_stderr\": 0.011479684550077692,\n \"acc_norm\": 0.28096479791395046,\n\
\ \"acc_norm_stderr\": 0.011479684550077692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2875816993464052,\n \"acc_stderr\": 0.018311653053648222,\n \
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.018311653053648222\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.026537045312145287,\n\
\ \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.026537045312145287\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.32338308457711445,\n\
\ \"acc_stderr\": 0.033076159479790326,\n \"acc_norm\": 0.32338308457711445,\n\
\ \"acc_norm_stderr\": 0.033076159479790326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006535,\n \"mc2\": 0.41227748774876055,\n\
\ \"mc2_stderr\": 0.014572961912704371\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6503551696921863,\n \"acc_stderr\": 0.013402073680850515\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \
\ \"acc_stderr\": 0.003970449129848635\n }\n}\n```"
repo_url: https://huggingface.co/mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|arc:challenge|25_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|gsm8k|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hellaswag|10_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-49-52.400292.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T18-49-52.400292.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- '**/details_harness|winogrande|5_2023-12-09T18-49-52.400292.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T18-49-52.400292.parquet'
- config_name: results
data_files:
- split: 2023_12_09T18_49_52.400292
path:
- results_2023-12-09T18-49-52.400292.parquet
- split: latest
path:
- results_2023-12-09T18-49-52.400292.parquet
---
# Dataset Card for Evaluation run of mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1](https://huggingface.co/mwitiderrick/shearedplats-2.7b-v2-instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1",
"harness_winogrande_5",
split="train")
```
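Since the timestamped splits follow the `%Y_%m_%dT%H_%M_%S.%f` pattern visible above (e.g. `2023_12_09T18_49_52.400292`), you can recover a `datetime` from a split name. The helpers below are an illustrative sketch, not part of this dataset's tooling:

```python
from datetime import datetime

# Run-timestamp split names use underscores in place of colons/dashes,
# e.g. "2023_12_09T18_49_52.400292".
TIMESTAMP_FORMAT = "%Y_%m_%dT%H_%M_%S.%f"

def split_to_datetime(split_name: str) -> datetime:
    """Recover the run time encoded in a timestamped split name."""
    return datetime.strptime(split_name, TIMESTAMP_FORMAT)

def latest_timestamped_split(split_names: list[str]) -> str:
    """Pick the most recent timestamped split, ignoring the 'latest' alias."""
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=split_to_datetime)

run = split_to_datetime("2023_12_09T18_49_52.400292")
print(run.isoformat())  # 2023-12-09T18:49:52.400292
```

This can be handy when a configuration accumulates several runs and you want to order them by time rather than rely on the "latest" alias.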
## Latest results
These are the [latest results from run 2023-12-09T18:49:52.400292](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__shearedplats-2.7b-v2-instruct-v0.1/blob/main/results_2023-12-09T18-49-52.400292.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2882361931913223,
"acc_stderr": 0.031895486998552665,
"acc_norm": 0.2903928342274915,
"acc_norm_stderr": 0.03267939625046512,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006535,
"mc2": 0.41227748774876055,
"mc2_stderr": 0.014572961912704371
},
"harness|arc:challenge|25": {
"acc": 0.3660409556313993,
"acc_stderr": 0.01407722310847014,
"acc_norm": 0.40187713310580203,
"acc_norm_stderr": 0.014327268614578274
},
"harness|hellaswag|10": {
"acc": 0.5142401911969727,
"acc_stderr": 0.00498775731476984,
"acc_norm": 0.7007568213503286,
"acc_norm_stderr": 0.00456990648509029
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.030251237579213174,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.030251237579213174
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.038783523721386215,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.038783523721386215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.020940481565334835,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.020940481565334835
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624335,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624335
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29292929292929293,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.29292929292929293,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.031618779179354115,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.031618779179354115
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02144454730156047,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02144454730156047
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279496,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279496
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.034791855725996586,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.034791855725996586
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28990825688073396,
"acc_stderr": 0.019453066609201597,
"acc_norm": 0.28990825688073396,
"acc_norm_stderr": 0.019453066609201597
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.030388051301678116,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.030388051301678116
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31645569620253167,
"acc_stderr": 0.03027497488021898,
"acc_norm": 0.31645569620253167,
"acc_norm_stderr": 0.03027497488021898
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.32515337423312884,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.32515337423312884,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3162393162393162,
"acc_stderr": 0.030463656747340268,
"acc_norm": 0.3162393162393162,
"acc_norm_stderr": 0.030463656747340268
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.37292464878671777,
"acc_stderr": 0.017292868269453924,
"acc_norm": 0.37292464878671777,
"acc_norm_stderr": 0.017292868269453924
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210742,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788513,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788513
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3271604938271605,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.3271604938271605,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503793,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503793
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28096479791395046,
"acc_stderr": 0.011479684550077692,
"acc_norm": 0.28096479791395046,
"acc_norm_stderr": 0.011479684550077692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.018311653053648222,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.018311653053648222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.026537045312145287,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.026537045312145287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.32338308457711445,
"acc_stderr": 0.033076159479790326,
"acc_norm": 0.32338308457711445,
"acc_norm_stderr": 0.033076159479790326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006535,
"mc2": 0.41227748774876055,
"mc2_stderr": 0.014572961912704371
},
"harness|winogrande|5": {
"acc": 0.6503551696921863,
"acc_stderr": 0.013402073680850515
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tyzhu/wiki_find_passage_train100_eval40_num | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 158210
num_examples: 240
- name: validation
num_bytes: 33332
num_examples: 40
download_size: 95417
dataset_size: 191542
---
# Dataset Card for "wiki_find_passage_train100_eval40_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ighoshsubho/llama_mistral_dataset | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 82923
num_examples: 434
download_size: 37474
dataset_size: 82923
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_allenai__tulu-2-dpo-70b | ---
pretty_name: Evaluation run of allenai/tulu-2-dpo-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allenai/tulu-2-dpo-70b](https://huggingface.co/allenai/tulu-2-dpo-70b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allenai__tulu-2-dpo-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T06:48:43.589029](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__tulu-2-dpo-70b/blob/main/results_2024-02-02T06-48-43.589029.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.699296658680991,\n\
\ \"acc_stderr\": 0.03051571429129605,\n \"acc_norm\": 0.7020037559735633,\n\
\ \"acc_norm_stderr\": 0.031114133505086575,\n \"mc1\": 0.4675642594859241,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6577655722264159,\n\
\ \"mc2_stderr\": 0.014903281756393213\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n\
\ \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.01310678488360134\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7082254530969926,\n\
\ \"acc_stderr\": 0.004536500714147989,\n \"acc_norm\": 0.8898625771758614,\n\
\ \"acc_norm_stderr\": 0.0031242116171988606\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7509433962264151,\n \"acc_stderr\": 0.02661648298050171,\n\
\ \"acc_norm\": 0.7509433962264151,\n \"acc_norm_stderr\": 0.02661648298050171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.049020713000019756,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.049020713000019756\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7225433526011561,\n\
\ \"acc_stderr\": 0.03414014007044037,\n \"acc_norm\": 0.7225433526011561,\n\
\ \"acc_norm_stderr\": 0.03414014007044037\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.03001755447188056,\n\
\ \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.03001755447188056\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.04113914981189261,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.04113914981189261\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188716,\n \"\
acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188716\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821676,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821676\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.01742697415424052,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.01742697415424052\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.022421273612923707,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.022421273612923707\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.025859164122051453,\n\
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.025859164122051453\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8935779816513761,\n \"acc_stderr\": 0.013221554674594372,\n \"\
acc_norm\": 0.8935779816513761,\n \"acc_norm_stderr\": 0.013221554674594372\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.028380391147094713,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.028380391147094713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.03278548537343138,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.03278548537343138\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\"\
: 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.029634717272371037,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.029634717272371037\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867457,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867457\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8531289910600255,\n\
\ \"acc_stderr\": 0.012658201736147288,\n \"acc_norm\": 0.8531289910600255,\n\
\ \"acc_norm_stderr\": 0.012658201736147288\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071124,\n\
\ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.511731843575419,\n\
\ \"acc_stderr\": 0.016717897676932162,\n \"acc_norm\": 0.511731843575419,\n\
\ \"acc_norm_stderr\": 0.016717897676932162\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.02418515064781871,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.02418515064781871\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.02133086876212706,\n\
\ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.02133086876212706\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.02949482760014436,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.02949482760014436\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.546284224250326,\n\
\ \"acc_stderr\": 0.012715404841277752,\n \"acc_norm\": 0.546284224250326,\n\
\ \"acc_norm_stderr\": 0.012715404841277752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.01728276069516741,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.01728276069516741\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.7727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.027049257915896175,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.027049257915896175\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327691,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327691\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4675642594859241,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6577655722264159,\n\
\ \"mc2_stderr\": 0.014903281756393213\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828079\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6262319939347991,\n \
\ \"acc_stderr\": 0.013326342860737007\n }\n}\n```"
repo_url: https://huggingface.co/allenai/tulu-2-dpo-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|arc:challenge|25_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|gsm8k|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hellaswag|10_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T06-48-43.589029.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T06-48-43.589029.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- '**/details_harness|winogrande|5_2024-02-02T06-48-43.589029.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T06-48-43.589029.parquet'
- config_name: results
data_files:
- split: 2024_02_02T06_48_43.589029
path:
- results_2024-02-02T06-48-43.589029.parquet
- split: latest
path:
- results_2024-02-02T06-48-43.589029.parquet
---
# Dataset Card for Evaluation run of allenai/tulu-2-dpo-70b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allenai/tulu-2-dpo-70b](https://huggingface.co/allenai/tulu-2-dpo-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allenai__tulu-2-dpo-70b",
	"harness_winogrande_5",
	split="latest")
```
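Since each run is stored under a timestamped split alongside the "latest" alias, a small helper can choose which split to load without hard-coding the timestamp. This is an illustrative sketch (the `pick_split` helper is not part of the `datasets` library); it relies on the split-name format used above, where zero-padded timestamps sort chronologically as plain strings.

```python
def pick_split(split_names, prefer_latest=True):
    """Choose which split of a config to load.

    Split names are either the literal 'latest' alias or a run
    timestamp such as '2024_02_02T06_48_43.589029'. When 'latest'
    is absent (or not preferred), fall back to the most recent
    timestamped run.
    """
    if prefer_latest and "latest" in split_names:
        return "latest"
    timestamped = [s for s in split_names if s != "latest"]
    # Zero-padded 'YYYY_MM_DDTHH_MM_SS' timestamps sort
    # chronologically when compared as strings.
    return max(timestamped)

# With the single run present in this dataset:
splits = ["2024_02_02T06_48_43.589029", "latest"]
print(pick_split(splits))                        # -> latest
print(pick_split(splits, prefer_latest=False))   # -> 2024_02_02T06_48_43.589029
```

The returned name can then be passed directly as the `split` argument of `load_dataset`.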
## Latest results
These are the [latest results from run 2024-02-02T06:48:43.589029](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__tulu-2-dpo-70b/blob/main/results_2024-02-02T06-48-43.589029.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; each eval's results can be found in its timestamped split and in the "latest" split):
```python
{
"all": {
"acc": 0.699296658680991,
"acc_stderr": 0.03051571429129605,
"acc_norm": 0.7020037559735633,
"acc_norm_stderr": 0.031114133505086575,
"mc1": 0.4675642594859241,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6577655722264159,
"mc2_stderr": 0.014903281756393213
},
"harness|arc:challenge|25": {
"acc": 0.6825938566552902,
"acc_stderr": 0.013602239088038167,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.01310678488360134
},
"harness|hellaswag|10": {
"acc": 0.7082254530969926,
"acc_stderr": 0.004536500714147989,
"acc_norm": 0.8898625771758614,
"acc_norm_stderr": 0.0031242116171988606
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7509433962264151,
"acc_stderr": 0.02661648298050171,
"acc_norm": 0.7509433962264151,
"acc_norm_stderr": 0.02661648298050171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.03414014007044037,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.03414014007044037
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.03001755447188056,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.03001755447188056
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049398,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821676,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821676
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.01742697415424052,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.01742697415424052
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.022421273612923707,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.022421273612923707
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.025859164122051453,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.025859164122051453
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8935779816513761,
"acc_stderr": 0.013221554674594372,
"acc_norm": 0.8935779816513761,
"acc_norm_stderr": 0.013221554674594372
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094713,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.03278548537343138,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.03278548537343138
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.0309227883204458,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.0309227883204458
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.029634717272371037,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.029634717272371037
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867457,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867457
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8531289910600255,
"acc_stderr": 0.012658201736147288,
"acc_norm": 0.8531289910600255,
"acc_norm_stderr": 0.012658201736147288
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.511731843575419,
"acc_stderr": 0.016717897676932162,
"acc_norm": 0.511731843575419,
"acc_norm_stderr": 0.016717897676932162
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.02418515064781871,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.02418515064781871
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.02133086876212706,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.02133086876212706
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.02949482760014436,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.02949482760014436
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.546284224250326,
"acc_stderr": 0.012715404841277752,
"acc_norm": 0.546284224250326,
"acc_norm_stderr": 0.012715404841277752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.01728276069516741,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.01728276069516741
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.027049257915896175,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.027049257915896175
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327691,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327691
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4675642594859241,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6577655722264159,
"mc2_stderr": 0.014903281756393213
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828079
},
"harness|gsm8k|5": {
"acc": 0.6262319939347991,
"acc_stderr": 0.013326342860737007
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tollefj/massive-en-no-shorter-transfer | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
splits:
- name: train
num_bytes: 44628652
num_examples: 758144
download_size: 33446436
dataset_size: 44628652
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc
task_categories:
- translation
- summarization
language:
- 'no'
- nb
- en
pretty_name: Massive EN-NO shorter transfer
size_categories:
- 100K<n<1M
---
# Massive EN-NO shorter and similar transfer
A dataset of EN-NO translations drawn from the following sources:
- https://huggingface.co/datasets/opus100
- https://huggingface.co/datasets/opus_books
- https://huggingface.co/datasets/open_subtitles (https://huggingface.co/datasets/tollefj/subtitles-en-no-similar-shorter)
- https://huggingface.co/datasets/RuterNorway/Fleurs-Alpaca-EN-NO
The data was processed by:
- simple preprocessing: stripping whitespace and fixing misplaced punctuation
- computing all similarities with https://huggingface.co/NbAiLab/nb-sbert-base
- effectively aligning the translations
- filtering out pairs where the target language (Norwegian) is shorter than 70% of the length of the source language (English)
- pairs with fewer than 6 words are kept regardless of the length constraint
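The length constraint above can be sketched as a simple filter. This is an illustrative sketch only, assuming character-based lengths and that the 6-word pass applies to the source side; `keep_pair` is a hypothetical helper, not part of the original pipeline:

```python
def keep_pair(en: str, no: str, min_ratio: float = 0.7, short_pass: int = 6) -> bool:
    """Length filter sketch: keep short pairs unconditionally, otherwise
    require the Norwegian target to be at least `min_ratio` of the
    English source length (character count assumed)."""
    if len(en.split()) < short_pass:
        return True
    return len(no) >= min_ratio * len(en)


pairs = [
    ("Hello there!", "Hei!"),  # under 6 words: kept regardless of length
    ("This is a fairly long English sentence for testing.",
     "Dette er en ganske lang engelsk setning for testing."),  # kept
    ("This is a fairly long English sentence for testing.",
     "Kort."),  # target far shorter than 70% of source: dropped
]
filtered = [p for p in pairs if keep_pair(*p)]
```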
This results in a shorter, more closely aligned translation corpus. |
EleutherAI/quirky_subtraction_increment0_bob | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 12663979.0
num_examples: 192000
- name: validation
num_bytes: 263906.0
num_examples: 4000
- name: test
num_bytes: 263762.0
num_examples: 4000
download_size: 4073079
dataset_size: 13191647.0
---
# Dataset Card for "quirky_subtraction_increment0_bob"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gbssreejith/Sm_Type1_dataset_finetuned1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 45846795.0
num_examples: 200
- name: test
num_bytes: 3594005.0
num_examples: 16
- name: val
num_bytes: 1643626.0
num_examples: 7
download_size: 48645908
dataset_size: 51084426.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
pytc/BvEM | ---
license: mit
---
|
tcsenpai/aggregated_captcha_images_and_text | ---
license: cc-by-nc-4.0
---
# Aggregated Captcha Images and Text
## Credits
All images (not the texts) contained here were downloaded and selected from various datasets on kaggle.com.
### What is this?
This dataset contains several hundred thousand images taken from real, in-use captchas (reCAPTCHA, hCaptcha, and various others), together with an equally large number of random 4-8 character texts, each rendered in one of 363 different fonts with random noise, sizes, colors, and scratches.
While the generated texts may prove difficult for your trained models to recognize, the sheer quantity of images gives a model a significant chance of recognizing captcha images.
### Disclaimer
This dataset is NOT intended to break the ToS of any website or to carry out malicious, illegal, or unethical actions. It is distributed for a purely informative and educational purpose, namely the study of the weaknesses and strengths of current protection systems.
You will notice, for example, that puzzle-based captchas are highly resistant to this kind of analysis. |
mesmalif/amazon-shoe-reviews | ---
dataset_info:
features:
- name: marketplace
dtype: string
- name: customer_id
dtype: string
- name: review_id
dtype: string
- name: product_id
dtype: string
- name: product_parent
dtype: string
- name: product_title
dtype: string
- name: product_category
dtype: string
- name: labels
dtype: int64
- name: helpful_votes
dtype: int64
- name: total_votes
dtype: int64
- name: vine
dtype: int64
- name: verified_purchase
dtype: int64
- name: review_headline
dtype: string
- name: text
dtype: string
- name: review_date
dtype: string
splits:
- name: train
num_bytes: 34784832.6
num_examples: 90000
- name: test
num_bytes: 3864981.4
num_examples: 10000
download_size: 21283157
dataset_size: 38649814.0
---
# Dataset Card for "amazon-shoe-reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tnewaz/kd | ---
license: unknown
---
|
open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B-v2 | ---
pretty_name: Evaluation run of paulml/OmniBeagleSquaredMBX-v3-7B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [paulml/OmniBeagleSquaredMBX-v3-7B-v2](https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T03:04:32.503339](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B-v2/blob/main/results_2024-02-10T03-04-32.503339.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6520313155539911,\n\
\ \"acc_stderr\": 0.032055304264286724,\n \"acc_norm\": 0.6510392594733034,\n\
\ \"acc_norm_stderr\": 0.03273146844780618,\n \"mc1\": 0.591187270501836,\n\
\ \"mc1_stderr\": 0.017209952151641724,\n \"mc2\": 0.7292550145611886,\n\
\ \"mc2_stderr\": 0.014624521700190086\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n\
\ \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.012808273573927106\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7211710814578769,\n\
\ \"acc_stderr\": 0.004475067344626756,\n \"acc_norm\": 0.8892650866361282,\n\
\ \"acc_norm_stderr\": 0.003131622628199085\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473086,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.591187270501836,\n\
\ \"mc1_stderr\": 0.017209952151641724,\n \"mc2\": 0.7292550145611886,\n\
\ \"mc2_stderr\": 0.014624521700190086\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8555643251775849,\n \"acc_stderr\": 0.009879767358079229\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6990144048521607,\n \
\ \"acc_stderr\": 0.01263450446521118\n }\n}\n```"
repo_url: https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|arc:challenge|25_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|gsm8k|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hellaswag|10_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T03-04-32.503339.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T03-04-32.503339.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- '**/details_harness|winogrande|5_2024-02-10T03-04-32.503339.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T03-04-32.503339.parquet'
- config_name: results
data_files:
- split: 2024_02_10T03_04_32.503339
path:
- results_2024-02-10T03-04-32.503339.parquet
- split: latest
path:
- results_2024-02-10T03-04-32.503339.parquet
---
# Dataset Card for Evaluation run of paulml/OmniBeagleSquaredMBX-v3-7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [paulml/OmniBeagleSquaredMBX-v3-7B-v2](https://huggingface.co/paulml/OmniBeagleSquaredMBX-v3-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B-v2",
"harness_winogrande_5",
split="train")
```
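The per-task config names listed above follow a regular naming convention (`harness_<task>[_<subject>]_<n_shot>`). As a small sketch of that convention (the helper function below is ours for illustration, not part of the `datasets` API), the config name for any task can be built programmatically:

```python
def harness_config_name(task: str, n_shot: int, subject: str = None) -> str:
    """Build a config name matching this card's convention, e.g.
    "harness_gsm8k_5" or "harness_hendrycksTest_world_religions_5"."""
    base = f"harness_{task}"
    if subject is not None:
        base += f"_{subject}"
    return f"{base}_{n_shot}"

print(harness_config_name("gsm8k", 5))
# harness_gsm8k_5
print(harness_config_name("hendrycksTest", 5, "world_religions"))
# harness_hendrycksTest_world_religions_5
```

The resulting string can be passed as the second argument to `load_dataset` in the snippet above, together with either the `"latest"` split or a timestamped split name.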
## Latest results
These are the [latest results from run 2024-02-10T03:04:32.503339](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__OmniBeagleSquaredMBX-v3-7B-v2/blob/main/results_2024-02-10T03-04-32.503339.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6520313155539911,
"acc_stderr": 0.032055304264286724,
"acc_norm": 0.6510392594733034,
"acc_norm_stderr": 0.03273146844780618,
"mc1": 0.591187270501836,
"mc1_stderr": 0.017209952151641724,
"mc2": 0.7292550145611886,
"mc2_stderr": 0.014624521700190086
},
"harness|arc:challenge|25": {
"acc": 0.7167235494880546,
"acc_stderr": 0.013167478735134575,
"acc_norm": 0.7406143344709898,
"acc_norm_stderr": 0.012808273573927106
},
"harness|hellaswag|10": {
"acc": 0.7211710814578769,
"acc_stderr": 0.004475067344626756,
"acc_norm": 0.8892650866361282,
"acc_norm_stderr": 0.003131622628199085
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473086,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.591187270501836,
"mc1_stderr": 0.017209952151641724,
"mc2": 0.7292550145611886,
"mc2_stderr": 0.014624521700190086
},
"harness|winogrande|5": {
"acc": 0.8555643251775849,
"acc_stderr": 0.009879767358079229
},
"harness|gsm8k|5": {
"acc": 0.6990144048521607,
"acc_stderr": 0.01263450446521118
}
}
```
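As a rough illustration of how these per-task numbers roll up (using a hand-copied subset of the `acc` values above; the leaderboard's own aggregation covers all tasks), a mean accuracy can be computed like this:

```python
# Per-task accuracies hand-copied from the results above (subset only, for illustration).
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.34,
    "harness|hendrycksTest-anatomy|5": 0.6592592592592592,
    "harness|hendrycksTest-astronomy|5": 0.7236842105263158,
}

# Unweighted mean over the selected tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean acc over {len(task_acc)} tasks: {mean_acc:.4f}")
```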
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AdapterOcean/GPTeacher_roleplay_standardized_cluster_2_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 422818
num_examples: 537
download_size: 246747
dataset_size: 422818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPTeacher_roleplay_standardized_cluster_2_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kweimann/poe-learning-layouts | ---
license: mit
---
# Learning layouts in Path of Exile with Vision Transformers: A proof of concept
<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/650c55bc9169ea73315b6c22/RJ-rTPWwOFUZlA3ydqhZ2.mp4"></video>
Where's the exit? This question often crosses the minds of newcomers and seasoned players alike. The key lies in understanding the game's layouts, especially during the campaign, when taking a wrong turn can significantly slow you down. Our project aims to solve this challenge through machine learning.
We've developed a proof-of-concept for learning layouts in Path of Exile using Vision Transformers. We trained a Vision Transformer to predict the direction of the exit in the A3 Marketplace, relying solely on a video of the minimap. You can see the model in action in the video above: the red arrow indicates the predicted exit direction, while the green arrow shows the actual direction.
Project page: https://github.com/kweimann/poe-learning-layouts |
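The actual model details live in the project page above; purely as a hypothetical sketch (the function name and the (dx, dy) direction representation are assumptions, not taken from the project's code), a predicted exit direction could be turned into an arrow angle like this:

```python
import math

def direction_to_angle_deg(dx: float, dy: float) -> float:
    """Convert a predicted 2D direction vector (e.g. a model's (dx, dy) output)
    into a compass-style angle in degrees, measured clockwise from 'up'.

    Assumes image coordinates where the y-axis points down.
    """
    # atan2 handles all quadrants; -dy flips the image y-axis back to 'up'.
    angle = math.degrees(math.atan2(dx, -dy))
    return angle % 360.0

# Example: a vector pointing straight right maps to 90 degrees.
print(direction_to_angle_deg(1.0, 0.0))
```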
ibranze/araproje_mmlu_en_w1 | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 132093.13725490196
num_examples: 250
download_size: 0
dataset_size: 132093.13725490196
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_en_w1"
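Per the feature schema above, `answer` is stored as a class label whose integer ids map to the letters A–D. A minimal dependency-free sketch of decoding it (the example row is made up, not an actual item from the dataset):

```python
# Label names as declared in the dataset's class_label feature.
ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer(label_id: int) -> str:
    """Map the stored integer class id back to its answer letter."""
    return ANSWER_NAMES[label_id]

# Hypothetical example row (not an actual item from the dataset).
row = {"question": "2 + 2 = ?", "choices": ["3", "4", "5", "6"], "answer": 1}
print(decode_answer(row["answer"]))
```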
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KHM-hf/myDataset | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ | ---
pretty_name: Evaluation run of TheBloke/Airoboros-L2-70B-2.1-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Airoboros-L2-70B-2.1-GPTQ](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-08T02:26:46.433766](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public/blob/main/results_2023-11-08T02-26-46.433766.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4241820469798658,\n\
\ \"em_stderr\": 0.0050612570385902955,\n \"f1\": 0.5410476090604083,\n\
\ \"f1_stderr\": 0.004613044422574753,\n \"acc\": 0.48424459945200166,\n\
\ \"acc_stderr\": 0.010393744134050047\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4241820469798658,\n \"em_stderr\": 0.0050612570385902955,\n\
\ \"f1\": 0.5410476090604083,\n \"f1_stderr\": 0.004613044422574753\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15238817285822592,\n \
\ \"acc_stderr\": 0.009899572254794198\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.010887916013305896\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_08T02_26_46.433766
path:
- '**/details_harness|drop|3_2023-11-08T02-26-46.433766.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-08T02-26-46.433766.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_08T02_26_46.433766
path:
- '**/details_harness|gsm8k|5_2023-11-08T02-26-46.433766.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-08T02-26-46.433766.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_08T02_26_46.433766
path:
- '**/details_harness|winogrande|5_2023-11-08T02-26-46.433766.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-08T02-26-46.433766.parquet'
- config_name: results
data_files:
- split: 2023_11_08T02_26_46.433766
path:
- results_2023-11-08T02-26-46.433766.parquet
- split: latest
path:
- results_2023-11-08T02-26-46.433766.parquet
---
# Dataset Card for Evaluation run of TheBloke/Airoboros-L2-70B-2.1-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Airoboros-L2-70B-2.1-GPTQ](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-08T02:26:46.433766](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public/blob/main/results_2023-11-08T02-26-46.433766.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.4241820469798658,
"em_stderr": 0.0050612570385902955,
"f1": 0.5410476090604083,
"f1_stderr": 0.004613044422574753,
"acc": 0.48424459945200166,
"acc_stderr": 0.010393744134050047
},
"harness|drop|3": {
"em": 0.4241820469798658,
"em_stderr": 0.0050612570385902955,
"f1": 0.5410476090604083,
"f1_stderr": 0.004613044422574753
},
"harness|gsm8k|5": {
"acc": 0.15238817285822592,
"acc_stderr": 0.009899572254794198
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.010887916013305896
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
orafandina/wiki_long_600k | ---
license: apache-2.0
---
|
yirenlu/heroicons-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4270773.0
num_examples: 292
download_size: 4220476
dataset_size: 4270773.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "heroicons-captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yujiepan/awq-model-zoo | ---
tags:
- awq
- llm
- quantization
---
# yujiepan/awq-model-zoo
Here is some pre-computed AWQ information (scales & clips) used in [llm-awq](https://github.com/mit-han-lab/llm-awq).
## Scripts
- Install the forked `llm-awq` at [https://github.com/yujiepan-work/llm-awq/tree/a41a08e79d8eb3d6335485b3625410af22a74426](https://github.com/yujiepan-work/llm-awq/tree/a41a08e79d8eb3d6335485b3625410af22a74426). Note: works with transformers==4.35.2
- Generating awq-info.pt:
```bash
python do_awq.py --model_id mistralai/Mistral-7B-v0.1 --w_bit 8 --q_group_size 128 --dump_awq ./awq-info.pt
```
- Load a quantized model: You can use the official repo to get a fake/real quantized model. Alternatively, you can load a fake-quantized model:
```python
from do_awq import FakeAWQModel
FakeAWQModel.from_pretrained('mistralai/Mistral-7B-v0.1', awq_meta_path='./awq-info.pt', output_folder='./tmp/')
```
Note: the code is not in good shape.
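For background, the "scales" stored in `awq-info.pt` follow the AWQ idea of per-channel weight scaling that is mathematically invisible to the full-precision model: weight columns are scaled up while the matching activation channels are scaled down. The snippet below is a generic sketch of that identity, not code from this repo or from `llm-awq`:

```python
# AWQ-style equivalence: scaling weight columns up and the matching
# activation channels down leaves the layer output unchanged in full
# precision, while making salient weight channels easier to quantize.
def matvec(w, x):
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

w = [[0.5, -1.0], [2.0, 0.25]]   # toy 2x2 weight matrix
x = [4.0, 8.0]                   # toy activation vector
s = [2.0, 4.0]                   # per-input-channel scales

w_scaled = [[wi * si for wi, si in zip(row, s)] for row in w]
x_scaled = [xi / si for xi, si in zip(x, s)]

print(matvec(w, x), matvec(w_scaled, x_scaled))  # identical outputs
```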
## Related links
- <https://huggingface.co/datasets/mit-han-lab/awq-model-zoo>
|
DynamicSuperb/EmotionalSpeechAudioClassification_RAVDESS-EmotionalSound | ---
license: cc-by-nc-sa-4.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 598283894.96
num_examples: 1440
download_size: 325216537
dataset_size: 598283894.96
---
|
Falah/story44kids_1_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3254
num_examples: 10
download_size: 4900
dataset_size: 3254
---
# Dataset Card for "story44kids_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Undi95__X-MythoChronos-13B | ---
pretty_name: Evaluation run of Undi95/X-MythoChronos-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/X-MythoChronos-13B](https://huggingface.co/Undi95/X-MythoChronos-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__X-MythoChronos-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T15:55:58.756519](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__X-MythoChronos-13B/blob/main/results_2023-12-09T15-55-58.756519.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5641085013010667,\n\
\ \"acc_stderr\": 0.0335879510752552,\n \"acc_norm\": 0.570142814951906,\n\
\ \"acc_norm_stderr\": 0.03430315611658459,\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.535496493693775,\n\
\ \"mc2_stderr\": 0.015937525418247476\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216383,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.01433223630679015\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6448914558852819,\n\
\ \"acc_stderr\": 0.004775681871529864,\n \"acc_norm\": 0.8338976299541924,\n\
\ \"acc_norm_stderr\": 0.0037141188843173825\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009798,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009798\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655802,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655802\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.02686020644472434,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.02686020644472434\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.025317649726448663,\n\
\ \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448663\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137504,\n \"\
acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137504\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398675,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398675\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584194,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584194\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.488268156424581,\n\
\ \"acc_stderr\": 0.016717897676932162,\n \"acc_norm\": 0.488268156424581,\n\
\ \"acc_norm_stderr\": 0.016717897676932162\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510467998,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510467998\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037106,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037106\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n\
\ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n\
\ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.019997973035458333,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.019997973035458333\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.535496493693775,\n\
\ \"mc2_stderr\": 0.015937525418247476\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440474\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22971948445792267,\n \
\ \"acc_stderr\": 0.011586857544997501\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/X-MythoChronos-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-55-58.756519.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-55-58.756519.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- '**/details_harness|winogrande|5_2023-12-09T15-55-58.756519.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T15-55-58.756519.parquet'
- config_name: results
data_files:
- split: 2023_12_09T15_55_58.756519
path:
- results_2023-12-09T15-55-58.756519.parquet
- split: latest
path:
- results_2023-12-09T15-55-58.756519.parquet
---
# Dataset Card for Evaluation run of Undi95/X-MythoChronos-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/X-MythoChronos-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/X-MythoChronos-13B](https://huggingface.co/Undi95/X-MythoChronos-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__X-MythoChronos-13B",
"harness_winogrande_5",
split="train")
```
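The config names above map mechanically to the parquet glob patterns listed in `data_files`. A minimal sketch of that mapping (the helper name is our own, not part of the card's tooling; tasks with colons in their harness name, such as `truthfulqa:mc`, would need an extra lookup):

```python
def config_to_glob(config_name: str, timestamp: str) -> str:
    """Map a config name like 'harness_hendrycksTest_virology_5' to the
    parquet glob pattern used in the data_files section above."""
    assert config_name.startswith("harness_")
    body = config_name[len("harness_"):]   # e.g. 'hendrycksTest_virology_5'
    task, n_shot = body.rsplit("_", 1)     # split off the few-shot count
    # MMLU subtasks use a '-' between 'hendrycksTest' and the subject
    task = task.replace("hendrycksTest_", "hendrycksTest-", 1)
    return f"**/details_harness|{task}|{n_shot}_{timestamp}.parquet"

print(config_to_glob("harness_hendrycksTest_virology_5",
                     "2023-12-09T15-55-58.756519"))
# -> **/details_harness|hendrycksTest-virology|5_2023-12-09T15-55-58.756519.parquet
```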
## Latest results
These are the [latest results from run 2023-12-09T15:55:58.756519](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__X-MythoChronos-13B/blob/main/results_2023-12-09T15-55-58.756519.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5641085013010667,
"acc_stderr": 0.0335879510752552,
"acc_norm": 0.570142814951906,
"acc_norm_stderr": 0.03430315611658459,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.535496493693775,
"mc2_stderr": 0.015937525418247476
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216383,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.01433223630679015
},
"harness|hellaswag|10": {
"acc": 0.6448914558852819,
"acc_stderr": 0.004775681871529864,
"acc_norm": 0.8338976299541924,
"acc_norm_stderr": 0.0037141188843173825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009798,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009798
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655802,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655802
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472434,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472434
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.025317649726448663,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.025317649726448663
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137504,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398675,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584194,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584194
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.488268156424581,
"acc_stderr": 0.016717897676932162,
"acc_norm": 0.488268156424581,
"acc_norm_stderr": 0.016717897676932162
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510467998,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510467998
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037106,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037106
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44002607561929596,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.44002607561929596,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.019997973035458333,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.019997973035458333
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.535496493693775,
"mc2_stderr": 0.015937525418247476
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440474
},
"harness|gsm8k|5": {
"acc": 0.22971948445792267,
"acc_stderr": 0.011586857544997501
}
}
```
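The per-task entries above can be aggregated directly once the JSON is loaded; a minimal sketch, using a small inline subset of the results rather than the full file:

```python
# Small inline subset of the results JSON above, for illustration only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5037037037037037},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5657894736842105},
    "harness|winogrande|5": {"acc": 0.744277821625888},
}

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = {k: v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")}
mean_acc = sum(mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
# -> 3 MMLU subtasks, mean acc = 0.4632
```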
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Melanit/testsetjax | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
- name: chunks
list:
- name: text
dtype: string
- name: timestamp
sequence: float64
splits:
- name: example
num_bytes: 5699512.0
num_examples: 10
download_size: 4385109
dataset_size: 5699512.0
configs:
- config_name: default
data_files:
- split: example
path: data/example-*
---
# Dataset Card for "testsetjax"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/facebook_engagement_data | ---
dataset_info:
features:
- name: source_id
dtype: string
- name: source_name
dtype: string
- name: author
dtype: string
- name: title
dtype: string
- name: description
dtype: string
- name: url
dtype: string
- name: url_to_image
dtype: string
- name: published_at
dtype: string
- name: content
dtype: string
- name: top_article
dtype: float64
- name: engagement_reaction_count
dtype: float64
- name: engagement_comment_count
dtype: float64
- name: engagement_share_count
dtype: float64
- name: engagement_comment_plugin_count
dtype: float64
splits:
- name: train
num_bytes: 1111542
num_examples: 1428
download_size: 622130
dataset_size: 1111542
---
# Dataset Card for "facebook_engagement_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
misterwhisperorg/datasets2 | ---
license: apache-2.0
---
|
CyberHarem/kazemaru_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kazemaru/カゼマル/风丸 (Arknights)
This is the dataset of kazemaru/カゼマル/风丸 (Arknights), containing 44 images and their tags.
The core tags of this character are `animal_ears, cat_ears, breasts, long_hair, earrings, hair_ornament, tail, animal_ear_fluff, purple_eyes, cat_girl, cat_tail, multicolored_hair, hairclip, medium_breasts, braid, grey_hair, maid_headdress`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 85.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kazemaru_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 44 | 69.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kazemaru_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 116 | 143.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kazemaru_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kazemaru_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
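If you prefer not to use waifuc, the IMG+TXT packages above can be consumed directly: each image ships with a sidecar `.txt` file of comma-separated tags. A minimal sketch (the helper is our own, not part of the dataset tooling):

```python
import os

def load_img_txt_pairs(dataset_dir: str):
    """Pair each image in an extracted IMG+TXT package with its sidecar
    .txt tag file. Returns a list of (image_path, tags) tuples; images
    without a tag file are skipped."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if os.path.exists(txt_path):
            with open(txt_path, encoding="utf-8") as f:
                tags = [t.strip() for t in f.read().split(",") if t.strip()]
            pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```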
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | 1girl, solo, looking_at_viewer, white_gloves, black_dress, elbow_gloves, tongue_out, blush, puffy_short_sleeves, cleavage, holding, maid, twin_braids, smile, cross_earrings, white_pantyhose, blonde_hair, large_breasts, pink_eyes, simple_background |
| 1 | 16 |  |  |  |  |  | 1girl, solo, looking_at_viewer, open_mouth, simple_background, hairband, open_jacket, single_hair_bun, single_side_bun, white_background, yellow_jacket, bare_shoulders, jewelry, long_sleeves, blush, collarbone, off_shoulder, smile, thighhighs, cat, grey_eyes, holding, shorts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | white_gloves | black_dress | elbow_gloves | tongue_out | blush | puffy_short_sleeves | cleavage | holding | maid | twin_braids | smile | cross_earrings | white_pantyhose | blonde_hair | large_breasts | pink_eyes | simple_background | open_mouth | hairband | open_jacket | single_hair_bun | single_side_bun | white_background | yellow_jacket | bare_shoulders | jewelry | long_sleeves | collarbone | off_shoulder | thighhighs | cat | grey_eyes | shorts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------------|:--------------|:---------------|:-------------|:--------|:----------------------|:-----------|:----------|:-------|:--------------|:--------|:-----------------|:------------------|:--------------|:----------------|:------------|:--------------------|:-------------|:-----------|:--------------|:------------------|:------------------|:-------------------|:----------------|:-----------------|:----------|:---------------|:-------------|:---------------|:-------------|:------|:------------|:---------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | X | X | | | | | X | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Thanmay/revised_toxigen-hi | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: toxicity_score
dtype: float64
- name: id
dtype: int64
- name: target_groups
sequence: string
- name: itv2 hi text
dtype: string
splits:
- name: validation
num_bytes: 2482
num_examples: 5
- name: test
num_bytes: 2680072
num_examples: 6509
download_size: 1138333
dataset_size: 2682554
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo | ---
pretty_name: Evaluation run of huseyinatahaninan/phi-2-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [huseyinatahaninan/phi-2-dpo](https://huggingface.co/huseyinatahaninan/phi-2-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T21:58:15.192256](https://huggingface.co/datasets/open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo/blob/main/results_2024-02-12T21-58-15.192256.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5870761708653485,\n\
\ \"acc_stderr\": 0.03369469581974977,\n \"acc_norm\": 0.5884353168964569,\n\
\ \"acc_norm_stderr\": 0.034381836157511524,\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.016272287957916912,\n \"mc2\": 0.45354154186159823,\n\
\ \"mc2_stderr\": 0.015221463708711597\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.014291228393536588,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491897\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5765783708424617,\n\
\ \"acc_stderr\": 0.004930911515084782,\n \"acc_norm\": 0.7635929097789285,\n\
\ \"acc_norm_stderr\": 0.004240066898702509\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46825396825396826,\n \"acc_stderr\": 0.025699352832131796,\n \"\
acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.025699352832131796\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7096774193548387,\n \"acc_stderr\": 0.025822106119415898,\n \"\
acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.025822106119415898\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.02869787397186067,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.02869787397186067\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940784,\n\
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940784\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478465,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478465\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.03283472056108561,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03283472056108561\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035296,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n\
\ \"acc_stderr\": 0.0167063814150579,\n \"acc_norm\": 0.6781609195402298,\n\
\ \"acc_norm_stderr\": 0.0167063814150579\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688228,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688228\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859924,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859924\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.02753007844711031,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.02753007844711031\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192703,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192703\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n\
\ \"acc_stderr\": 0.012612974369390973,\n \"acc_norm\": 0.4217731421121252,\n\
\ \"acc_norm_stderr\": 0.012612974369390973\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.030306257722468314,\n\
\ \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.030306257722468314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5637254901960784,\n \"acc_stderr\": 0.02006287424353913,\n \
\ \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.02006287424353913\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768924,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768924\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n\
\ \"mc1_stderr\": 0.016272287957916912,\n \"mc2\": 0.45354154186159823,\n\
\ \"mc2_stderr\": 0.015221463708711597\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5670962850644428,\n \
\ \"acc_stderr\": 0.013647916362576054\n }\n}\n```"
repo_url: https://huggingface.co/huseyinatahaninan/phi-2-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|arc:challenge|25_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|gsm8k|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hellaswag|10_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T21-58-15.192256.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T21-58-15.192256.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- '**/details_harness|winogrande|5_2024-02-12T21-58-15.192256.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T21-58-15.192256.parquet'
- config_name: results
data_files:
- split: 2024_02_12T21_58_15.192256
path:
- results_2024-02-12T21-58-15.192256.parquet
- split: latest
path:
- results_2024-02-12T21-58-15.192256.parquet
---
# Dataset Card for Evaluation run of huseyinatahaninan/phi-2-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [huseyinatahaninan/phi-2-dpo](https://huggingface.co/huseyinatahaninan/phi-2-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-12T21:58:15.192256](https://huggingface.co/datasets/open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo/blob/main/results_2024-02-12T21-58-15.192256.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5870761708653485,
"acc_stderr": 0.03369469581974977,
"acc_norm": 0.5884353168964569,
"acc_norm_stderr": 0.034381836157511524,
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916912,
"mc2": 0.45354154186159823,
"mc2_stderr": 0.015221463708711597
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.014291228393536588,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491897
},
"harness|hellaswag|10": {
"acc": 0.5765783708424617,
"acc_stderr": 0.004930911515084782,
"acc_norm": 0.7635929097789285,
"acc_norm_stderr": 0.004240066898702509
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.025699352832131796,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.025699352832131796
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.025822106119415898,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.025822106119415898
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.02869787397186067,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.02869787397186067
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940784,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940784
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03283472056108561,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03283472056108561
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035296,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652265,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6781609195402298,
"acc_stderr": 0.0167063814150579,
"acc_norm": 0.6781609195402298,
"acc_norm_stderr": 0.0167063814150579
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688228,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688228
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859924,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859924
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.02753007844711031,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.02753007844711031
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192703,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192703
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.02691500301138016,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.02691500301138016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778852,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778852
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4217731421121252,
"acc_stderr": 0.012612974369390973,
"acc_norm": 0.4217731421121252,
"acc_norm_stderr": 0.012612974369390973
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.030306257722468314,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.030306257722468314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.02006287424353913,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.02006287424353913
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768924,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768924
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916912,
"mc2": 0.45354154186159823,
"mc2_stderr": 0.015221463708711597
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
},
"harness|gsm8k|5": {
"acc": 0.5670962850644428,
"acc_stderr": 0.013647916362576054
}
}
```
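As an illustrative sketch (not the official aggregation code), the `"all"` section above is essentially a macro-average of the per-task scores. The same kind of aggregation can be reproduced in plain Python, using a few example values copied from the results JSON:

```python
# Illustrative sketch: macro-average a handful of per-task `acc` values,
# mirroring how the "all" entry summarizes individual task scores.
# The task names and values below are copied from the results JSON above;
# this is not the leaderboard's actual aggregation code.
per_task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.3,
    "harness|hendrycksTest-anatomy|5": 0.4444444444444444,
    "harness|hendrycksTest-astronomy|5": 0.5986842105263158,
}

macro_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(f"macro-averaged acc over {len(per_task_acc)} tasks: {macro_acc:.4f}")
```

The real `"all"` figure averages over all 63 task configurations, so it will differ from this three-task toy value.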
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
PeacefulData/HypoTranslate | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
- zh
- ja
- fr
- es
- it
- pt
tags:
- generative translation
- large language model
- LLaMA
pretty_name: HypoTranslate
size_categories:
- 100K<n<1M
---
This repo releases the HypoTranslate dataset from the paper "GenTranslate: Large Language Models are Generative Multilingual Speech and Machine Translators".
If you find this work related or useful for your research, please kindly cite the work below. Thank you.
```bib
@article{hu2024gentranslate,
title={GenTranslate: Large Language Models are Generative Multilingual Speech and Machine Translators},
author={Hu, Yuchen and Chen, Chen and Yang, Chao-Han Huck and Li, Ruizhe and Zhang, Dong and Chen, Zhehuai and Chng, Eng Siong},
journal={arXiv preprint arXiv:2402.06894},
year={2024}
}
``` |
open-llm-leaderboard/details_perlthoughts__Chupacabra-8x7B-MoE | ---
pretty_name: Evaluation run of perlthoughts/Chupacabra-8x7B-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Chupacabra-8x7B-MoE](https://huggingface.co/perlthoughts/Chupacabra-8x7B-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-8x7B-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T21:20:16.522598](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-8x7B-MoE/blob/main/results_2023-12-16T21-20-16.522598.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6415463145761969,\n\
\ \"acc_stderr\": 0.03233222952484684,\n \"acc_norm\": 0.6432118874966731,\n\
\ \"acc_norm_stderr\": 0.03298409149127022,\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.6350369384683723,\n\
\ \"mc2_stderr\": 0.01508168993616602\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6561433447098977,\n \"acc_stderr\": 0.01388064457015621,\n\
\ \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688067\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6757618004381597,\n\
\ \"acc_stderr\": 0.004671328673217797,\n \"acc_norm\": 0.8610834495120494,\n\
\ \"acc_norm_stderr\": 0.003451525868724678\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n\
\ \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"\
acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.023661296393964273,\n\
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.023661296393964273\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n\
\ \"acc_stderr\": 0.01640712303219525,\n \"acc_norm\": 0.4033519553072626,\n\
\ \"acc_norm_stderr\": 0.01640712303219525\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046623,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046623\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n\
\ \"acc_stderr\": 0.012727084826799802,\n \"acc_norm\": 0.4589308996088657,\n\
\ \"acc_norm_stderr\": 0.012727084826799802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.6350369384683723,\n\
\ \"mc2_stderr\": 0.01508168993616602\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938278\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5966641394996209,\n \
\ \"acc_stderr\": 0.013512654781814706\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Chupacabra-8x7B-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|arc:challenge|25_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|gsm8k|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hellaswag|10_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-20-16.522598.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T21-20-16.522598.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- '**/details_harness|winogrande|5_2023-12-16T21-20-16.522598.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T21-20-16.522598.parquet'
- config_name: results
data_files:
- split: 2023_12_16T21_20_16.522598
path:
- results_2023-12-16T21-20-16.522598.parquet
- split: latest
path:
- results_2023-12-16T21-20-16.522598.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-8x7B-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-8x7B-MoE](https://huggingface.co/perlthoughts/Chupacabra-8x7B-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-8x7B-MoE",
"harness_winogrande_5",
split="train")
```
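The `config_name` values listed above follow a regular pattern. As an illustrative sketch (the helper below is hypothetical, not part of any library), a task name and few-shot count appear to map to a config name like so:

```python
# Illustrative helper (not an official API): the config names above appear to
# follow "harness_<task>_<n_shot>", with ":" and "-" in the task name
# replaced by underscores.
def config_name(task: str, n_shot: int) -> str:
    sanitized = task.replace(":", "_").replace("-", "_")
    return f"harness_{sanitized}_{n_shot}"

print(config_name("hendrycksTest-world_religions", 5))  # harness_hendrycksTest_world_religions_5
print(config_name("truthfulqa:mc", 0))                  # harness_truthfulqa_mc_0
```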
## Latest results
These are the [latest results from run 2023-12-16T21:20:16.522598](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-8x7B-MoE/blob/main/results_2023-12-16T21-20-16.522598.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6415463145761969,
"acc_stderr": 0.03233222952484684,
"acc_norm": 0.6432118874966731,
"acc_norm_stderr": 0.03298409149127022,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.6350369384683723,
"mc2_stderr": 0.01508168993616602
},
"harness|arc:challenge|25": {
"acc": 0.6561433447098977,
"acc_stderr": 0.01388064457015621,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688067
},
"harness|hellaswag|10": {
"acc": 0.6757618004381597,
"acc_stderr": 0.004671328673217797,
"acc_norm": 0.8610834495120494,
"acc_norm_stderr": 0.003451525868724678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.023661296393964273,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.023661296393964273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.01640712303219525,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.01640712303219525
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046623,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046623
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799802,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.6350369384683723,
"mc2_stderr": 0.01508168993616602
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938278
},
"harness|gsm8k|5": {
"acc": 0.5966641394996209,
"acc_stderr": 0.013512654781814706
}
}
```
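To illustrate the shape of these results, here is a minimal sketch (using a hand-copied subset of the values above, three subjects only for brevity) that averages the per-subject `hendrycksTest` accuracies:

```python
# Minimal sketch: average the per-subject MMLU ("hendrycksTest") accuracies
# from a dict shaped like the JSON above (only three subjects shown here).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
}
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(round(sum(mmlu_accs) / len(mmlu_accs), 4))  # 0.5442
```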
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zolak/twitter_dataset_50_1713135229 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 468988
num_examples: 1125
download_size: 245051
dataset_size: 468988
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lilacai/lilac-OpenHermes-2.5 | ---
tags:
- Lilac
---
# lilac/OpenHermes-2.5
This dataset is a [Lilac](http://lilacml.com)-processed dataset. Original dataset: [https://huggingface.co/datasets/teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-OpenHermes-2.5
```
or from Python with:
```py
import lilac as ll

ll.download("lilacai/lilac-OpenHermes-2.5")
```
|
open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload | ---
pretty_name: Evaluation run of Aspik101/llama-30b-2048-instruct-PL-lora_unload
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aspik101/llama-30b-2048-instruct-PL-lora_unload](https://huggingface.co/Aspik101/llama-30b-2048-instruct-PL-lora_unload)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T16:55:26.750337](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload/blob/main/results_2023-09-23T16-55-26.750337.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006396812080536913,\n\
\ \"em_stderr\": 0.0008164468837432337,\n \"f1\": 0.09082529362416124,\n\
\ \"f1_stderr\": 0.00181131297042163,\n \"acc\": 0.48843566764183,\n\
\ \"acc_stderr\": 0.010921337573474368\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.006396812080536913,\n \"em_stderr\": 0.0008164468837432337,\n\
\ \"f1\": 0.09082529362416124,\n \"f1_stderr\": 0.00181131297042163\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17892342683851403,\n \
\ \"acc_stderr\": 0.010557661392901289\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047448\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Aspik101/llama-30b-2048-instruct-PL-lora_unload
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T16_55_26.750337
path:
- '**/details_harness|drop|3_2023-09-23T16-55-26.750337.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T16-55-26.750337.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T16_55_26.750337
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-55-26.750337.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-55-26.750337.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:59:52.848491.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:59:52.848491.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:59:52.848491.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T16_55_26.750337
path:
- '**/details_harness|winogrande|5_2023-09-23T16-55-26.750337.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T16-55-26.750337.parquet'
- config_name: results
data_files:
- split: 2023_08_09T15_59_52.848491
path:
- results_2023-08-09T15:59:52.848491.parquet
- split: 2023_09_23T16_55_26.750337
path:
- results_2023-09-23T16-55-26.750337.parquet
- split: latest
path:
- results_2023-09-23T16-55-26.750337.parquet
---
# Dataset Card for Evaluation run of Aspik101/llama-30b-2048-instruct-PL-lora_unload
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/llama-30b-2048-instruct-PL-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aspik101/llama-30b-2048-instruct-PL-lora_unload](https://huggingface.co/Aspik101/llama-30b-2048-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-23T16:55:26.750337](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__llama-30b-2048-instruct-PL-lora_unload/blob/main/results_2023-09-23T16-55-26.750337.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006396812080536913,
"em_stderr": 0.0008164468837432337,
"f1": 0.09082529362416124,
"f1_stderr": 0.00181131297042163,
"acc": 0.48843566764183,
"acc_stderr": 0.010921337573474368
},
"harness|drop|3": {
"em": 0.006396812080536913,
"em_stderr": 0.0008164468837432337,
"f1": 0.09082529362416124,
"f1_stderr": 0.00181131297042163
},
"harness|gsm8k|5": {
"acc": 0.17892342683851403,
"acc_stderr": 0.010557661392901289
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.011285013754047448
}
}
```
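As a sanity check on how the `"all"` block relates to the per-task entries (an observation about these numbers, not an official formula from the leaderboard), the overall accuracy above is simply the mean of the GSM8K and Winogrande accuracies:

```python
# Per-task accuracies copied from the latest results JSON above.
gsm8k_acc = 0.17892342683851403
winogrande_acc = 0.797947908445146

# The aggregate "acc" in the "all" block is the plain mean of the two.
overall_acc = (gsm8k_acc + winogrande_acc) / 2
print(overall_acc)  # ≈ 0.48843566764183, matching the "all" block
```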
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
smanna/indian_constitution_data | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1765635
num_examples: 6510
download_size: 697957
dataset_size: 1765635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- table-question-answering
- question-answering
- text-generation
- fill-mask
- feature-extraction
language:
- en
tags:
- legal
pretty_name: complex_const
--- |
aviroes/above_70yo_elderly_people_other_dataset | ---
configs:
- config_name: default
data_files:
- split: other
path: data/other-*
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
splits:
- name: other
num_bytes: 116941.34140285537
num_examples: 2
download_size: 124504
dataset_size: 116941.34140285537
---
# Dataset Card for "above_70yo_elderly_people_other_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_hellaswag_en_conf_gpt_bestscore_reversed | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 81196
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_conf_gpt_bestscore_reversed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DeCoders/Doctor_mini | ---
license: llama2
---
|
shibing624/alpaca-zh | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 32150579
num_examples: 48818
download_size: 35100559
dataset_size: 32150579
license: cc-by-4.0
language:
- zh
pretty_name: Instruction Tuning with GPT-4
size_categories:
- 10K<n<100K
task_categories:
- text-generation
tags:
- gpt
- alpaca
- fine-tune
- instruct-tune
- instruction
---
# Dataset Description
- **Project Page:** https://instruction-tuning-with-gpt-4.github.io
- **Repo:** https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM
- **Paper:** https://arxiv.org/abs/2304.03277
# Dataset Card for "alpaca-zh"
This dataset contains roughly 50,000 Chinese self-instruct examples generated with GPT-4, following the Alpaca approach.
Dataset from https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM
It is the Chinese dataset from https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM/blob/main/data/alpaca_gpt4_data_zh.json
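Each record carries `instruction`, `input`, and `output` fields (per the schema above). Below is a minimal sketch of rendering one record as a single training string; the prompt template and the sample record are illustrative assumptions, not the exact format expected by any particular trainer.

```python
# A minimal sketch of turning one instruction/input/output record into an
# Alpaca-style training prompt. The template below is an assumption, not
# the official format; adapt it to whatever your trainer expects.

def format_prompt(record: dict) -> str:
    """Render an instruction/input/output record as a single training string."""
    if record.get("input"):
        return (
            f"Instruction: {record['instruction']}\n"
            f"Input: {record['input']}\n"
            f"Response: {record['output']}"
        )
    return (
        f"Instruction: {record['instruction']}\n"
        f"Response: {record['output']}"
    )

# Illustrative record in the dataset's shape (not taken from the data).
sample = {
    "instruction": "把下面的句子翻译成英文。",
    "input": "今天天气很好。",
    "output": "The weather is nice today.",
}
print(format_prompt(sample))
```

Records with an empty `input` collapse to a two-part prompt, which mirrors how the original Alpaca recipe distinguishes the two cases.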
# Usage and License Notices
The data is intended and licensed for research use only. The dataset is CC BY NC 4.0 (allowing only non-commercial use) and models trained using the dataset should not be used outside of research purposes.
Train a model with the alpaca-zh dataset: https://github.com/shibing624/textgen
# English Dataset
[Found here](https://huggingface.co/datasets/c-s-ale/alpaca-gpt4-data)
# Citation
```
@article{peng2023gpt4llm,
title={Instruction Tuning with GPT-4},
  author={Baolin Peng and Chunyuan Li and Pengcheng He and Michel Galley and Jianfeng Gao},
journal={arXiv preprint arXiv:2304.03277},
year={2023}
}
``` |
Rahulrayudu/Farm_QA_Dataset_inst | ---
dataset_info:
features:
- name: Crop
dtype: string
- name: Label
dtype: string
- name: Question
dtype: string
- name: Answer
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 629733
num_examples: 839
download_size: 254469
dataset_size: 629733
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Thefoodprocessor/recipes | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: recipe
dtype: string
splits:
- name: train
num_bytes: 105767040
num_examples: 74465
download_size: 53711472
dataset_size: 105767040
---
# Dataset Card for "recipes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chungimungi/pubmed | ---
task_categories:
- text-classification
- table-question-answering
- token-classification
- question-answering
- zero-shot-classification
- feature-extraction
- text-generation
- text2text-generation
- sentence-similarity
language:
- en
tags:
- medical
pretty_name: 'y'
---
# PubMed dataset in raw XML.
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Once a year, NLM produces a baseline set of PubMed citation records in XML format for download; the baseline file is a complete snapshot of PubMed data. When using this data in a local database, the best practice is to overwrite your local data each year with the baseline data.
## Dataset Structure
XML
### Source Data
https://ftp.ncbi.nlm.nih.gov/pubmed/baseline/
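The baseline files are citation records in PubMed's XML format. A minimal sketch of extracting fields with the standard library follows; the element names (`PubmedArticle`, `PMID`, `ArticleTitle`) follow PubMed's citation DTD, but the inline record is a toy illustration rather than real baseline data.

```python
# Sketch: pull (PMID, title) pairs out of PubMed citation XML using only
# the standard library. The inline document is a toy illustration of the
# structure, not an actual record from the baseline files.
import xml.etree.ElementTree as ET

XML = """\
<PubmedArticleSet>
  <PubmedArticle>
    <MedlineCitation>
      <PMID>12345678</PMID>
      <Article>
        <ArticleTitle>An example citation record</ArticleTitle>
      </Article>
    </MedlineCitation>
  </PubmedArticle>
</PubmedArticleSet>
"""

def iter_citations(xml_text: str):
    """Yield (pmid, title) pairs from a PubmedArticleSet document."""
    root = ET.fromstring(xml_text)
    for article in root.iter("PubmedArticle"):
        pmid = article.findtext(".//PMID")
        title = article.findtext(".//ArticleTitle")
        yield pmid, title

for pmid, title in iter_citations(XML):
    print(pmid, title)
```

For the full multi-gigabyte baseline files, `ET.iterparse` with incremental element clearing would be the memory-safe variant of the same idea.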
|
CyberHarem/usami_sumireko_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of usami_sumireko/宇佐見菫子 (Touhou)
This is the dataset of usami_sumireko/宇佐見菫子 (Touhou), containing 500 images and their tags.
The core tags of this character are `brown_hair, glasses, brown_eyes, red-framed_eyewear, hat, twintails, bow, low_twintails, short_hair, hat_bow, semi-rimless_eyewear, under-rim_eyewear, black_headwear, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 563.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usami_sumireko_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 360.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usami_sumireko_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1103 | 720.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usami_sumireko_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 512.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usami_sumireko_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1103 | 955.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usami_sumireko_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/usami_sumireko_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
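If you use the IMG+TXT packages instead of the raw archive, each image ships with a sibling tag file. The sketch below pairs them with only the standard library; the layout assumed here (same basename, one comma-separated tag line per image) is an assumption to verify against an extracted archive.

```python
# Sketch: pair each image in an extracted IMG+TXT package with the tag
# list from its same-named .txt file. The same-basename, comma-separated
# convention is an assumption about the package layout.
import os

def load_tagged_images(dataset_dir: str) -> dict:
    """Map each image filename to the tag list from its sibling .txt file."""
    pairs = {}
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue  # skip the .txt files themselves and anything else
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if os.path.exists(txt_path):
            with open(txt_path, encoding="utf-8") as f:
                pairs[name] = [t.strip() for t in f.read().split(",") if t.strip()]
    return pairs
```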
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, nipples, solo, blush, looking_at_viewer, large_breasts, sweat, navel, open_mouth, smile, no_bra, open_shirt, plaid, simple_background, skirt, underwear |
| 1 | 12 |  |  |  |  |  | 1girl, cape, clothes_writing, plaid, skirt, smile, solo, shirt, long_sleeves, open_mouth, school_uniform, looking_at_viewer, gloves |
| 2 | 16 |  |  |  |  |  | 1girl, long_sleeves, plaid_skirt, plaid_vest, solo, purple_skirt, smile, looking_at_viewer, purple_vest, shoes, white_socks, full_body, kneehighs, cloak, runes, white_gloves, white_shirt, black_footwear, clothes_writing, black_cape, closed_mouth, open_mouth, white_bow, card |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | nipples | solo | blush | looking_at_viewer | large_breasts | sweat | navel | open_mouth | smile | no_bra | open_shirt | plaid | simple_background | skirt | underwear | cape | clothes_writing | shirt | long_sleeves | school_uniform | gloves | plaid_skirt | plaid_vest | purple_skirt | purple_vest | shoes | white_socks | full_body | kneehighs | cloak | runes | white_gloves | white_shirt | black_footwear | black_cape | closed_mouth | white_bow | card |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------|:--------|:--------------------|:----------------|:--------|:--------|:-------------|:--------|:---------|:-------------|:--------|:--------------------|:--------|:------------|:-------|:------------------|:--------|:---------------|:-----------------|:---------|:--------------|:-------------|:---------------|:--------------|:--------|:--------------|:------------|:------------|:--------|:--------|:---------------|:--------------|:-----------------|:-------------|:---------------|:------------|:-------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | X | | X | | | | X | X | | | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 2 | 16 |  |  |  |  |  | X | | X | | X | | | | X | X | | | | | | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/webley_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of webley/ウェブリー/韦伯利 (Girls' Frontline)
This is the dataset of webley/ウェブリー/韦伯利 (Girls' Frontline), containing 32 images and their tags.
The core tags of this character are `blue_eyes, bangs, brown_hair, ribbon, short_hair, bow, hair_between_eyes, hair_bow, two_side_up`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 32 | 46.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/webley_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 32 | 24.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/webley_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 79 | 51.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/webley_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 32 | 39.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/webley_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 79 | 75.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/webley_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/webley_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 32 |  |  |  |  |  | 1girl, looking_at_viewer, solo, long_sleeves, blush, red_cape, epaulettes, frills, handgun, closed_mouth, holding_gun, revolver, simple_background, white_dress, white_background, white_pantyhose |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | long_sleeves | blush | red_cape | epaulettes | frills | handgun | closed_mouth | holding_gun | revolver | simple_background | white_dress | white_background | white_pantyhose |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:--------|:-----------|:-------------|:---------|:----------|:---------------|:--------------|:-----------|:--------------------|:--------------|:-------------------|:------------------|
| 0 | 32 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/mayuzumi_fuyuko_theidolmstershinycolors | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mayuzumi_fuyuko/黛冬優子 (THE iDOLM@STER: SHINY COLORS)
This is the dataset of mayuzumi_fuyuko/黛冬優子 (THE iDOLM@STER: SHINY COLORS), containing 500 images and their tags.
The core tags of this character are `black_hair, long_hair, bangs, brown_eyes, blunt_bangs, breasts, two_side_up, ribbon, medium_breasts, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 919.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mayuzumi_fuyuko_theidolmstershinycolors/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 439.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mayuzumi_fuyuko_theidolmstershinycolors/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1302 | 1.00 GiB | [Download](https://huggingface.co/datasets/CyberHarem/mayuzumi_fuyuko_theidolmstershinycolors/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 777.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mayuzumi_fuyuko_theidolmstershinycolors/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1302 | 1.59 GiB | [Download](https://huggingface.co/datasets/CyberHarem/mayuzumi_fuyuko_theidolmstershinycolors/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mayuzumi_fuyuko_theidolmstershinycolors',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, cat_ears, looking_at_viewer, solo, cat_tail, animal_ear_fluff, blush, open_mouth, frills, red_bow, simple_background, hair_ribbon, hairclip, white_background, fang, black_choker, bowtie, cat_girl, dress, juliet_sleeves, white_shirt, claw_pose, vertical-striped_skirt |
| 1 | 6 |  |  |  |  |  | 1girl, black_ribbon, black_skirt, black_thighhighs, blush, long_sleeves, looking_at_viewer, pink_shirt, simple_background, solo, white_background, :d, neck_ribbon, open_mouth, zettai_ryouiki, frills, jirai_kei |
| 2 | 8 |  |  |  |  |  | 1girl, black_ribbon, black_skirt, blush, jirai_kei, long_sleeves, looking_at_viewer, pink_shirt, solo, neck_ribbon, simple_background, white_background, closed_mouth, frills, crossed_arms, upper_body |
| 3 | 13 |  |  |  |  |  | 1girl, bare_shoulders, long_sleeves, looking_at_viewer, plaid_dress, solo, black_choker, collarbone, blush, simple_background, white_background, detached_sleeves, frilled_dress, smile, upper_body, white_dress, grey_dress |
| 4 | 20 |  |  |  |  |  | 1girl, black_headwear, black_shirt, long_sleeves, looking_at_viewer, solo, heart_earrings, cleavage, blush, polka_dot_legwear, zettai_ryouiki, beret, leopard_print, fake_horns, collarbone, miniskirt, simple_background, sitting, white_background, brown_skirt, open_mouth, frilled_choker, :d, pink_thighhighs, print_skirt, shiny_hair, very_long_hair |
| 5 | 5 |  |  |  |  |  | 1girl, blush, long_sleeves, looking_at_viewer, smile, solo, cherry_blossoms, floating_hair, outdoors, upper_body, belt, hair_ribbon, open_mouth, pink_jacket, skirt, depth_of_field, falling_petals, frills, neck_ribbon, white_shirt |
| 6 | 11 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, earrings, nail_polish, hair_bow, long_sleeves, smile, chocolate, collarbone, holding, one_eye_closed, open_mouth, shirt, upper_body, blue_nails |
| 7 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, school_uniform, solo, plaid_skirt, pleated_skirt, white_shirt, cardigan_around_waist, double_bun, kogal, loose_bowtie, pink_skirt, simple_background, wavy_hair, white_background, blush, bracelet, smile, open_collar, collarbone, open_mouth, school_bag, sitting, sweater_around_waist, wrist_scrunchie |
| 8 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, crop_top, detached_sleeves, green_hair, midriff, miniskirt, navel, smile, two-tone_hair, cleavage, nail_polish, black_thighhighs, green_nails, long_sleeves, pleated_skirt, black_shirt, cowboy_shot, layered_skirt, open_mouth, stomach |
| 9 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, navel, see-through, solo, earrings, hair_flower, midriff, black_gloves, bare_shoulders, black_rose, blush, smile, black_bikini, bracelet, cleavage, collarbone, crop_top, lying, stomach |
| 10 | 24 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, blush, red_bikini, heart_print, collarbone, frilled_bikini, smile, navel, hair_ribbon, off-shoulder_bikini, open_mouth, bikini_skirt, outdoors, blue_sky, day, print_bikini, hair_bow, red_bow, white_background, bare_shoulders, black_choker, bracelet, ocean |
| 11 | 6 |  |  |  |  |  | 1girl, choker, looking_at_viewer, sailor_collar, solo, cat_ear_headphones, heart, shirt, short_sleeves, bag, blush, earrings, open_mouth, purple_skirt, star_(symbol), :d, bracelet, frilled_skirt, hairclip, pleated_skirt, scrunchie, white_thighhighs |
| 12 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, sun_hat, white_headwear, :d, blush, open_mouth, white_dress, blue_ribbon, blue_sky, hand_on_headwear, outdoors, sleeveless_dress, arm_up, day, floral_print, hat_ribbon, jewelry, white_belt |
| 13 | 7 |  |  |  |  |  | 1girl, bare_shoulders, solo, blush, casual_one-piece_swimsuit, heart_hair_ornament, looking_at_viewer, twin_braids, choker, collarbone, frilled_swimsuit, hairclip, cleavage, closed_mouth, heart_print, outdoors, twintails, bare_legs, barefoot, beach, blue_dress, blue_one-piece_swimsuit, blue_sky, cloud, day, hair_over_shoulder, heart_cutout, off_shoulder, pink_bow, smile, wariza |
| 14 | 5 |  |  |  |  |  | 1girl, blush, cleavage, collarbone, looking_at_viewer, solo, bare_shoulders, closed_mouth, navel, underwear_only, lace-trimmed_bra, on_bed, ass_visible_through_thighs, bed_sheet, blurry, curtains, indoors, lace-trimmed_panties, pillow, pink_panties, red_bra, red_panties, sitting, smile, stomach |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cat_ears | looking_at_viewer | solo | cat_tail | animal_ear_fluff | blush | open_mouth | frills | red_bow | simple_background | hair_ribbon | hairclip | white_background | fang | black_choker | bowtie | cat_girl | dress | juliet_sleeves | white_shirt | claw_pose | vertical-striped_skirt | black_ribbon | black_skirt | black_thighhighs | long_sleeves | pink_shirt | :d | neck_ribbon | zettai_ryouiki | jirai_kei | closed_mouth | crossed_arms | upper_body | bare_shoulders | plaid_dress | collarbone | detached_sleeves | frilled_dress | smile | white_dress | grey_dress | black_headwear | black_shirt | heart_earrings | cleavage | polka_dot_legwear | beret | leopard_print | fake_horns | miniskirt | sitting | brown_skirt | frilled_choker | pink_thighhighs | print_skirt | shiny_hair | very_long_hair | cherry_blossoms | floating_hair | outdoors | belt | pink_jacket | skirt | depth_of_field | falling_petals | earrings | nail_polish | hair_bow | chocolate | holding | one_eye_closed | shirt | blue_nails | school_uniform | plaid_skirt | pleated_skirt | cardigan_around_waist | double_bun | kogal | loose_bowtie | pink_skirt | wavy_hair | bracelet | open_collar | school_bag | sweater_around_waist | wrist_scrunchie | crop_top | green_hair | midriff | navel | two-tone_hair | green_nails | cowboy_shot | layered_skirt | stomach | see-through | hair_flower | black_gloves | black_rose | black_bikini | lying | red_bikini | heart_print | frilled_bikini | off-shoulder_bikini | bikini_skirt | blue_sky | day | print_bikini | ocean | choker | sailor_collar | cat_ear_headphones | heart | short_sleeves | bag | purple_skirt | star_(symbol) | frilled_skirt | scrunchie | white_thighhighs | sun_hat | white_headwear | blue_ribbon | hand_on_headwear | sleeveless_dress | arm_up | floral_print | hat_ribbon | jewelry | white_belt | casual_one-piece_swimsuit | heart_hair_ornament | twin_braids | frilled_swimsuit | twintails | bare_legs | barefoot | beach | blue_dress | blue_one-piece_swimsuit | cloud | hair_over_shoulder | heart_cutout | off_shoulder | pink_bow | wariza | underwear_only | lace-trimmed_bra | on_bed | ass_visible_through_thighs | bed_sheet | blurry | curtains | indoors | lace-trimmed_panties | pillow | pink_panties | red_bra | red_panties |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------|:--------------------|:-------|:-----------|:-------------------|:--------|:-------------|:---------|:----------|:--------------------|:--------------|:-----------|:-------------------|:-------|:---------------|:---------|:-----------|:--------|:-----------------|:--------------|:------------|:-------------------------|:---------------|:--------------|:-------------------|:---------------|:-------------|:-----|:--------------|:-----------------|:------------|:---------------|:---------------|:-------------|:-----------------|:--------------|:-------------|:-------------------|:----------------|:--------|:--------------|:-------------|:-----------------|:--------------|:-----------------|:-----------|:--------------------|:--------|:----------------|:-------------|:------------|:----------|:--------------|:-----------------|:------------------|:--------------|:-------------|:-----------------|:------------------|:----------------|:-----------|:-------|:--------------|:--------|:-----------------|:-----------------|:-----------|:--------------|:-----------|:------------|:----------|:-----------------|:--------|:-------------|:-----------------|:--------------|:----------------|:------------------------|:-------------|:--------|:---------------|:-------------|:------------|:-----------|:--------------|:-------------|:-----------------------|:------------------|:-----------|:-------------|:----------|:--------|:----------------|:--------------|:--------------|:----------------|:----------|:--------------|:--------------|:---------------|:-------------|:---------------|:--------|:-------------|:--------------|:-----------------|:----------------------|:---------------|:-----------|:------|:---------------|:--------|:---------|:----------------|:----------------------|:--------|:----------------|:------|:---------------|:----------------|:----------------|:------------|:-------------------|:----------|:-----------------|:--------------|:-------------------|:-------------------|:---------|:---------------|:-------------|:----------|:-------------|:----------------------------|:----------------------|:--------------|:-------------------|:------------|:------------|:-----------|:--------|:-------------|:--------------------------|:--------|:---------------------|:---------------|:---------------|:-----------|:---------|:-----------------|:-------------------|:---------|:-----------------------------|:------------|:---------|:-----------|:----------|:-----------------------|:---------|:---------------|:----------|:--------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | X | | | X | X | X | | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | X | | | X | | X | | X | | | X | | | | | | | | | | X | X | | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | | X | X | | | X | | | | X | | | X | | X | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 20 |  |  |  |  |  | X | | X | X | | | X | X | | | X | | | X | | | | | | | | | | | | | X | | X | | X | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | X | | | X | X | X | | | X | | | | | | | | | X | | | | | | X | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | | X | X | | | X | X | | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | X | | X | | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 10 |  |  |  |  |  | X | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | X | | X | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 24 |  |  |  |  |  | X | | X | X | | | X | X | | X | | X | | X | | X | | | | | | | | | | | | | | | | | | | | X | | X | | | X | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | | X | X | | | X | X | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 7 |  |  |  |  |  | X | | X | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | X | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 14 | 5 |  |  |  |  |  | X | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | X | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
huggingartists/billie-eilish | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/billie-eilish"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.734139 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/1aa6c04aad3652556046bb3aabe96498.900x900x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/billie-eilish">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Billie Eilish</div>
<a href="https://genius.com/artists/billie-eilish">
<div style="text-align: center; font-size: 14px;">@billie-eilish</div>
</a>
</div>
### Dataset Summary
This lyrics dataset was parsed from Genius and is designed for generating lyrics with HuggingArtists.
The corresponding model is available [here](https://huggingface.co/huggingartists/billie-eilish).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
You can load this dataset directly with the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/billie-eilish")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|298| -| -|
The `train` split can be divided into `train`, `validation`, and `test` splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/billie-eilish")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
texts = datasets['train']['text']
# split at the 90% and 97% marks to obtain the 90/7/3 proportions above
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
liaad/Bosque_PT-PT | ---
license: mit
dataset_info:
features:
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: pos_tags
sequence: string
splits:
- name: train
num_bytes: 5033815
num_examples: 9071
- name: test
num_bytes: 286364
num_examples: 576
download_size: 1758940
dataset_size: 5320179
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
task_categories:
- token-classification
language:
- pt
tags:
- pos
- pos-tagging
- part-of-speech
pretty_name: Bosque Part of Speech PT-PT
--- |
Seongill/squad_conflict_v2_under_150_with_substitution | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: masked_query
dtype: string
- name: query_embedding
sequence: float64
- name: ent_type
dtype: string
- name: answer
dtype: string
- name: random_answer
dtype: string
- name: similar_answer
dtype: string
- name: rewritten_context
dtype: string
- name: has_answer
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 199608048
num_examples: 25866
download_size: 140606479
dataset_size: 199608048
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
malaysia-ai/mosaic-yi | ---
language:
- ms
---
# Mosaic format for the filtered combined dataset to fine-tune Yi models
This repository stores dataset shards in Mosaic streaming (MDS) format.
1. prepared at https://github.com/malaysia-ai/dedup-text-dataset/blob/main/yi/combine-dataset.ipynb
2. using tokenizer https://huggingface.co/01-ai/Yi-6B
3. 4096 context length.
## how-to
1. git clone,
```bash
git lfs clone https://huggingface.co/datasets/malaysia-ai/mosaic-yi
```
2. load it,
```python
from streaming import LocalDataset
import numpy as np
from streaming.base.format.mds.encodings import Encoding, _encodings


class UInt16(Encoding):
    def encode(self, obj) -> bytes:
        return obj.tobytes()

    def decode(self, data: bytes):
        return np.frombuffer(data, np.uint16)


_encodings['uint16'] = UInt16

dataset = LocalDataset('mosaic-yi')
len(dataset)
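# Sanity check (a sketch, not from the original repo): the UInt16
# encoding above is just a raw-bytes round trip of uint16 token ids,
# which suffices because the Yi tokenizer vocabulary fits in 16 bits.
tokens = np.array([1, 2, 65535], dtype=np.uint16)
assert (np.frombuffer(tokens.tobytes(), np.uint16) == tokens).all()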
``` |
vedant2004/Airlinereview | ---
license: apache-2.0
---
|
islamrokon/Dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 379777.2888616891
num_examples: 735
- name: test
num_bytes: 42369.71113831089
num_examples: 82
download_size: 165978
dataset_size: 422147.0
---
# Dataset Card for "Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kubira_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kubira (Granblue Fantasy)
This is the dataset of kubira (Granblue Fantasy), containing 306 images and their tags.
The core tags of this character are `dark_skin, blonde_hair, horns, long_hair, dark-skinned_female, pointy_ears, breasts, large_breasts, bangs, yellow_eyes, horn_ornament, multicolored_hair, pink_hair, brown_eyes, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 306 | 456.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 306 | 260.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 762 | 562.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 306 | 401.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 762 | 804.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kubira_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kubira_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1boy, 1girl, blush, draph, hetero, looking_at_viewer, nipples, solo_focus, paizuri, smile, breasts_squeezed_together, open_mouth, penis, collarbone, cum_on_breasts, two-tone_hair, censored, hair_flower, horn_ribbon, jewelry |
| 1 | 44 |  |  |  |  |  | 1girl, draph, looking_at_viewer, solo, smile, black_bikini, official_alternate_costume, blush, cleavage, bare_shoulders, hair_flower, horn_ribbon, layered_bikini, navel, two-tone_hair, parted_bangs |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, belt, black_shorts, cleavage, draph, fur_trim, looking_at_viewer, midriff, navel, off_shoulder, short_shorts, smile, solo, wide_sleeves, blush, collarbone, elbow_gloves, long_sleeves, necklace, open_mouth, thighs, white_gloves, white_thighhighs, gourd, jacket, parted_bangs, sidelocks, simple_background, white_background, cowboy_shot, swept_bangs |
| 3 | 5 |  |  |  |  |  | 1girl, draph, smile, solo, blush, coat, long_sleeves, ribbon, sweater, upper_body, boar, jewelry, looking_at_viewer, red_scarf, snow |
| 4 | 6 |  |  |  |  |  | 1boy, 1girl, draph, hetero, navel, nipples, sex, solo_focus, sweat, parted_bangs, penis, pussy, spread_legs, vaginal, bar_censor, bed_sheet, colored_inner_hair, completely_nude, female_pubic_hair, missionary, on_back, open_mouth, two-tone_hair, cum, horn_ribbon, nose_blush, on_bed |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | blush | draph | hetero | looking_at_viewer | nipples | solo_focus | paizuri | smile | breasts_squeezed_together | open_mouth | penis | collarbone | cum_on_breasts | two-tone_hair | censored | hair_flower | horn_ribbon | jewelry | solo | black_bikini | official_alternate_costume | cleavage | bare_shoulders | layered_bikini | navel | parted_bangs | belt | black_shorts | fur_trim | midriff | off_shoulder | short_shorts | wide_sleeves | elbow_gloves | long_sleeves | necklace | thighs | white_gloves | white_thighhighs | gourd | jacket | sidelocks | simple_background | white_background | cowboy_shot | swept_bangs | coat | ribbon | sweater | upper_body | boar | red_scarf | snow | sex | sweat | pussy | spread_legs | vaginal | bar_censor | bed_sheet | colored_inner_hair | completely_nude | female_pubic_hair | missionary | on_back | cum | nose_blush | on_bed |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:--------|:--------|:---------|:--------------------|:----------|:-------------|:----------|:--------|:----------------------------|:-------------|:--------|:-------------|:-----------------|:----------------|:-----------|:--------------|:--------------|:----------|:-------|:---------------|:-----------------------------|:-----------|:-----------------|:-----------------|:--------|:---------------|:-------|:---------------|:-----------|:----------|:---------------|:---------------|:---------------|:---------------|:---------------|:-----------|:---------|:---------------|:-------------------|:--------|:---------|:------------|:--------------------|:-------------------|:--------------|:--------------|:-------|:---------|:----------|:-------------|:-------|:------------|:-------|:------|:--------|:--------|:--------------|:----------|:-------------|:------------|:---------------------|:------------------|:--------------------|:-------------|:----------|:------|:-------------|:---------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 44 |  |  |  |  |  | | X | X | X | | X | | | | X | | | | | | X | | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | | X | X | X | | X | | | | X | | X | | X | | | | | | | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | | X | X | X | | X | | | | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | X | X | | X | X | | | | X | X | | | X | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
one-sec-cv12/chunk_61 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24301968912.625
num_examples: 253019
download_size: 21418708481
dataset_size: 24301968912.625
---
# Dataset Card for "chunk_61"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gokul00060/armv1-jsonl | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_stsb_he_inanimate_objects | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 13287
num_examples: 61
- name: test
num_bytes: 7538
num_examples: 43
- name: train
num_bytes: 25309
num_examples: 119
download_size: 40508
dataset_size: 46134
---
# Dataset Card for "MULTI_VALUE_stsb_he_inanimate_objects"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kings-crown/Aircraft_Reports | ---
license: mit
---
|
MattiaL/tapir-cleaned-67k | ---
license: cc-by-nc-4.0
language:
- en
tags:
- instruction-finetuning
pretty_name: Tapir-Cleaned
task_categories:
- text-generation
size_categories:
- 10K<n<100K
---
# Dataset Card for Tapir-Cleaned
This is a revised version of the DAISLab dataset of IFTTT rules, which has been thoroughly cleaned, scored, and adjusted for the purpose of instruction-tuning.
## Tapir Dataset Summary
Tapir is a subset of the larger DAISLab dataset, which comprises 242,480 recipes extracted from the IFTTT platform.
After a thorough cleaning process that involved the removal of redundant and inconsistent recipes, the refined dataset was condensed to include 67,697 high-quality recipes.
This curated set of instruction data is particularly useful for conducting instruction-tuning exercises for language models,
allowing them to more accurately follow instructions and achieve superior performance.
The latest version of Tapir includes a correlation score that helps identify the most appropriate description-rule pairs for instruction tuning.
Description-rule pairs with a score greater than 0.75 are deemed good enough and are prioritized for further analysis and tuning.
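As a minimal sketch (plain Python, with toy records standing in for the real rows), filtering by this threshold looks like:

```python
# Toy description-rule records mimicking the Tapir fields; `score` is the
# correlation score described above (the values here are illustrative).
records = [
    {"input": "If it's raining outside, warm colors inside!", "score": 0.788197},
    {"input": "A loosely correlated description", "score": 0.42},
]

# Keep only pairs above the 0.75 threshold recommended for tuning.
high_quality = [r for r in records if r["score"] > 0.75]
print(len(high_quality))  # 1
```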
### Supported Tasks and Leaderboards
The Tapir dataset is designed for instruction-tuning pretrained language models.
### Languages
The data in Tapir are mainly in English (BCP-47 en).
# Dataset Structure
### Data Instances
```json
{
"instruction":"From the description of a rule: identify the 'trigger', identify the 'action', write a IF 'trigger' THEN 'action' rule.",
"input":"If it's raining outside, you'll want some nice warm colors inside!",
"output":"IF Weather Underground Current condition changes to THEN LIFX Change color of lights",
"score":"0.788197",
"text": "Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.\n\n### Instruction:\nFrom the description of a rule: identify the 'trigger', identify the 'action', write a IF 'trigger' THEN 'action' rule.\n\n### Input:\nIf it's raining outside, you'll want some nice warm colors inside!\n\n### Response:\nIF Weather Underground Current condition changes to THEN LIFX Change color of lights",
}
```
### Data Fields
The data fields are as follows:
* `instruction`: describes the task the model should perform.
* `input`: context or input for the task. Each of the 67K inputs is unique.
* `output`: the answer taken from the original Tapir Dataset formatted as an IFTTT recipe.
* `score`: the correlation score obtained via BertForNextSentencePrediction
* `text`: the `instruction`, `input` and `output` formatted with the [prompt template](https://github.com/tatsu-lab/stanford_alpaca#data-release) used by the authors of Alpaca for fine-tuning their models.
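The `text` field can be reconstructed from the other fields with the Alpaca prompt template shown in the example above; a sketch:

```python
def build_text(instruction: str, input_text: str, output: str) -> str:
    # Alpaca-style prompt template used for the `text` field.
    return (
        "Below is an instruction that describes a task, paired with an input "
        "that provides further context. Write a response that appropriately "
        "completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Input:\n{input_text}\n\n"
        f"### Response:\n{output}"
    )
```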
### Data Splits
| | train |
|---------------|------:|
| tapir | 67697 |
### Licensing Information
The dataset is available under the [Creative Commons NonCommercial (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode).
### Citation Information
```
@misc{tapir,
author = {Mattia Limone and Gaetano Cimino and Annunziata Elefante},
title = {TAPIR: Trigger Action Platform for Information Retrieval},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/MattiaLimone/ifttt_recommendation_system}},
}
``` |
autoevaluate/autoeval-staging-eval-project-c80bd5f3-aba9-44d4-aefd-7fef2e67a535-120116 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- autoevaluate/zero-shot-classification-sample
eval_info:
task: text_zero_shot_classification
model: autoevaluate/zero-shot-classification-not-evaluated
metrics: []
dataset_name: autoevaluate/zero-shot-classification-sample
dataset_config: autoevaluate--zero-shot-classification-sample
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: autoevaluate/zero-shot-classification-not-evaluated
* Dataset: autoevaluate/zero-shot-classification-sample
* Config: autoevaluate--zero-shot-classification-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
CyberHarem/sena_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sena/氷室セナ/濑名 (Blue Archive)
This is the dataset of sena/氷室セナ/濑名 (Blue Archive), containing 92 images and their tags.
The core tags of this character are `horns, short_hair, halo, yellow_eyes, braid, breasts, hat, white_hair, nurse_cap, large_breasts, black_horns, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 92 | 153.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sena_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 92 | 122.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sena_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 227 | 255.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sena_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sena_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | armband, blue_dress, closed_mouth, looking_at_viewer, short_sleeves, solo, 1girl, blush, nurse, simple_background, white_apron, hair_between_eyes, puffy_sleeves, white_background, white_headwear, black_gloves |
| 1 | 5 |  |  |  |  |  | 1girl, black_footwear, black_gloves, blue_dress, closed_mouth, full_body, puffy_short_sleeves, solo, white_apron, knee_boots, looking_at_viewer, standing, waist_apron, white_background, white_headwear, armband, chibi, hair_between_eyes, holding_gun, lace-up_boots, simple_background, brown_eyes, brown_footwear, medium_breasts, shoulder_bag |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | armband | blue_dress | closed_mouth | looking_at_viewer | short_sleeves | solo | 1girl | blush | nurse | simple_background | white_apron | hair_between_eyes | puffy_sleeves | white_background | white_headwear | black_gloves | black_footwear | full_body | puffy_short_sleeves | knee_boots | standing | waist_apron | chibi | holding_gun | lace-up_boots | brown_eyes | brown_footwear | medium_breasts | shoulder_bag |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------|:-------------|:---------------|:--------------------|:----------------|:-------|:--------|:--------|:--------|:--------------------|:--------------|:--------------------|:----------------|:-------------------|:-----------------|:---------------|:-----------------|:------------|:----------------------|:-------------|:-----------|:--------------|:--------|:--------------|:----------------|:-------------|:-----------------|:-----------------|:---------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | X | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/shatola_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shatola (Granblue Fantasy)
This is the dataset of shatola (Granblue Fantasy), containing 367 images and their tags.
The core tags of this character are `long_hair, animal_ears, blue_hair, breasts, horns, cow_ears, bangs, cow_horns, cow_girl, large_breasts, pointy_ears, ear_piercing`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 367 | 580.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 367 | 308.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 951 | 691.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 367 | 502.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 951 | 1.00 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shatola_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shatola_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; specific outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, draph, looking_at_viewer, red_dress, solo, blush, cleavage, smile, bell, bare_shoulders, open_mouth, bow, fur_collar, twintails, yellow_eyes, cow_print |
| 1 | 21 |  |  |  |  |  | 1girl, cleavage, cow_print, draph, looking_at_viewer, piercing, solo, bare_shoulders, blush, detached_sleeves, white_bikini, detached_collar, see-through, wide_sleeves, open_mouth, purple_eyes |
| 2 | 8 |  |  |  |  |  | 1girl, blush, cleavage, cow_print, detached_sleeves, draph, looking_at_viewer, piercing, solo, thighs, white_bikini, white_thighhighs, bare_shoulders, detached_collar, navel, see-through, short_shorts, sitting, white_shorts, wide_sleeves, purple_eyes, open_mouth, white_background, simple_background |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, cow_print, cow_tail, detached_collar, detached_sleeves, draph, looking_at_viewer, navel, piercing, short_shorts, solo, thighs, white_bikini, white_shorts, white_thighhighs, wide_sleeves, blush, micro_shorts, open_mouth |
| 4 | 9 |  |  |  |  |  | 1boy, 1girl, blush, cow_print, draph, hetero, nipples, paizuri, solo_focus, penis, piercing, earrings, looking_at_viewer, open_mouth, collarbone, detached_collar, bar_censor, detached_sleeves, huge_breasts, mosaic_censoring, purple_eyes, smile |
| 5 | 12 |  |  |  |  |  | 1girl, blush, cow_print, hetero, penis, sex, vaginal, 1boy, draph, navel, open_mouth, solo_focus, nipples, thighhighs, piercing, cum_in_pussy, girl_on_top, nude, bare_shoulders, cowgirl_position, detached_sleeves, looking_at_viewer, mosaic_censoring, bar_censor, smile, thighs |
| 6 | 5 |  |  |  |  |  | blush, draph, looking_at_viewer, onsen, 1girl, collarbone, night_sky, solo, towel_on_head, wet, naked_towel, open_mouth, sitting, smile, star_(sky), bare_shoulders, bathing, cleavage, completely_nude, huge_breasts, navel, nude_cover, steam_censor |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | looking_at_viewer | red_dress | solo | blush | cleavage | smile | bell | bare_shoulders | open_mouth | bow | fur_collar | twintails | yellow_eyes | cow_print | piercing | detached_sleeves | white_bikini | detached_collar | see-through | wide_sleeves | purple_eyes | thighs | white_thighhighs | navel | short_shorts | sitting | white_shorts | white_background | simple_background | cow_tail | micro_shorts | 1boy | hetero | nipples | paizuri | solo_focus | penis | earrings | collarbone | bar_censor | huge_breasts | mosaic_censoring | sex | vaginal | thighhighs | cum_in_pussy | girl_on_top | nude | cowgirl_position | onsen | night_sky | towel_on_head | wet | naked_towel | star_(sky) | bathing | completely_nude | nude_cover | steam_censor |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:------------|:-------|:--------|:-----------|:--------|:-------|:-----------------|:-------------|:------|:-------------|:------------|:--------------|:------------|:-----------|:-------------------|:---------------|:------------------|:--------------|:---------------|:--------------|:---------|:-------------------|:--------|:---------------|:----------|:---------------|:-------------------|:--------------------|:-----------|:---------------|:-------|:---------|:----------|:----------|:-------------|:--------|:-----------|:-------------|:-------------|:---------------|:-------------------|:------|:----------|:-------------|:---------------|:--------------|:-------|:-------------------|:--------|:------------|:----------------|:------|:--------------|:-------------|:----------|:------------------|:-------------|:---------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 21 |  |  |  |  |  | X | X | X | | X | X | X | | | X | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | | X | X | X | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | | X | X | X | | | X | X | | | | | X | X | X | X | X | | X | | X | X | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | | | X | | X | | | X | | | | | X | X | X | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | X | X | | | X | | X | | X | X | | | | | X | X | X | | | | | | X | | X | | | | | | | | X | X | X | | X | X | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | | X | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
solosolipsist/bibliotheque-mordecai-richler | ---
license: mit
---
|
nabonator/toy_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 181444.0
num_examples: 10
download_size: 167366
dataset_size: 181444.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
japanese-asr/whisper_transcriptions.reazonspeech.all_55 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30363885386.0
num_examples: 267370
download_size: 30127324216
dataset_size: 30363885386.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
j-chim/pii-pile-chunk3-300000-350000-tagged | ---
dataset_info:
features:
- name: texts
sequence: string
- name: meta
struct:
- name: pile_set_name
dtype: string
- name: scores
sequence: float64
- name: avg_score
dtype: float64
- name: num_sents
dtype: int64
- name: tagged_pii_results
list:
- name: analysis_explanation
dtype: 'null'
- name: end
dtype: int64
- name: entity_type
dtype: string
- name: recognition_metadata
struct:
- name: recognizer_identifier
dtype: string
- name: recognizer_name
dtype: string
- name: score
dtype: float64
- name: start
dtype: int64
splits:
- name: train
num_bytes: 510432044
num_examples: 50000
download_size: 194469001
dataset_size: 510432044
---
# Dataset Card for "pii-pile-chunk3-300000-350000-tagged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/centi_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of centi/センチ/桑迪/센티 (Nikke: Goddess of Victory)
This is the dataset of centi/センチ/桑迪/센티 (Nikke: Goddess of Victory), containing 23 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, long_hair, bangs, breasts, hat, large_breasts, baseball_cap, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 47.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centi_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 23       | 19.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centi_nikke/resolve/main/dataset-800.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 58 | 44.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centi_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 23       | 37.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centi_nikke/resolve/main/dataset-1200.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 58 | 74.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/centi_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To load it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/centi_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; specific outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, belt, crop_top, looking_at_viewer, midriff, navel, off_shoulder, solo, bare_shoulders, cleavage, collarbone, grey_pants, long_sleeves, open_jacket, parted_bangs, black_pants, blush, grey_jacket, smile, standing, black_jacket, closed_mouth, cowboy_shot, jewelry, open_mouth, sidelocks, simple_background, white_background |
| 1 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, denim_shorts, short_shorts, crop_top, see-through, thighs, cleavage, navel, open_mouth, smile, black_bra, blush, collarbone, midriff, necklace, long_sleeves, simple_background, off-shoulder_shirt, standing, bandaid_on_face, cutoffs, white_background, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | belt | crop_top | looking_at_viewer | midriff | navel | off_shoulder | solo | bare_shoulders | cleavage | collarbone | grey_pants | long_sleeves | open_jacket | parted_bangs | black_pants | blush | grey_jacket | smile | standing | black_jacket | closed_mouth | cowboy_shot | jewelry | open_mouth | sidelocks | simple_background | white_background | denim_shorts | short_shorts | see-through | thighs | black_bra | necklace | off-shoulder_shirt | bandaid_on_face | cutoffs | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:----------|:--------|:---------------|:-------|:-----------------|:-----------|:-------------|:-------------|:---------------|:--------------|:---------------|:--------------|:--------|:--------------|:--------|:-----------|:---------------|:---------------|:--------------|:----------|:-------------|:------------|:--------------------|:-------------------|:---------------|:---------------|:--------------|:---------|:------------|:-----------|:---------------------|:------------------|:----------|:--------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | X | X | X | X | | X | X | X | X | | X | | | | X | | X | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X |
|
SeoyeonChoi/customDataset_llama2_kor | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MartinKu/bookcorpus_stage1_SV_20230316 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2091780208
num_examples: 109310887
download_size: 1356114102
dataset_size: 2091780208
---
# Dataset Card for "bookcorpus_stage1_SV_20230316"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tierdesafinante/caco_antibes_td | ---
license: openrail
---
|
hugfaceguy0001/LightNovels100kto120k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 140078399
num_examples: 474
download_size: 88310840
dataset_size: 140078399
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ahishamm/Modified_Augmented_PH2_db_sharpened | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': benign
'1': malignant
splits:
- name: train
num_bytes: 118056337.324
num_examples: 2714
- name: test
num_bytes: 25334494.0
num_examples: 584
download_size: 144483186
dataset_size: 143390831.324
---
# Dataset Card for "Modified_Augmented_PH2_db_sharpened"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Guid0Craft/Test | ---
license: apache-2.0
---
|
irds/lotte_technology_dev_search | ---
pretty_name: '`lotte/technology/dev/search`'
viewer: false
source_datasets: ['irds/lotte_technology_dev']
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/technology/dev/search`
The `lotte/technology/dev/search` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/technology/dev/search).
# Data
This dataset provides:
- `queries` (i.e., topics); count=916
- `qrels`: (relevance assessments); count=2,676
- For `docs`, use [`irds/lotte_technology_dev`](https://huggingface.co/datasets/irds/lotte_technology_dev)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/lotte_technology_dev_search', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/lotte_technology_dev_search', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
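For evaluation, the qrels stream is often easier to use grouped per query; a minimal sketch (the helper name is ours, and the record shape matches the loop above):

```python
from collections import defaultdict

def qrels_to_dict(qrels):
    """Group relevance judgments into {query_id: {doc_id: relevance}}.

    `qrels` is any iterable of records shaped like those above:
    {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}.
    """
    grouped = defaultdict(dict)
    for rec in qrels:
        grouped[rec['query_id']][rec['doc_id']] = rec['relevance']
    return dict(grouped)
```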
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
316usman/thematic3cembed | ---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 10912441
num_examples: 15053
download_size: 3349048
dataset_size: 10912441
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-55000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1063264
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|