| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
| Juamzinhu/Daishikawa | 2023-09-18T22:18:04.000Z | ["region:us"] | Juamzinhu | null | null | null | 0 | 0 | Entry not found |
| jac12/jen | 2023-09-18T23:18:10.000Z | ["region:us"] | jac12 | null | null | null | 0 | 0 | Entry not found |
| jac12/mie | 2023-09-24T06:39:59.000Z | ["region:us"] | jac12 | null | null | null | 0 | 0 | Entry not found |
| bsankar/github-issues | 2023-09-18T23:33:52.000Z | ["region:us"] | bsankar | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
- name: is_closed
dtype: bool
- name: close_time
dtype: duration[us]
splits:
- name: train
num_bytes: 12125043
num_examples: 1000
download_size: 3282501
dataset_size: 12125043
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
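Given the feature schema declared in `dataset_info` above, here is a minimal sketch of how one might load and inspect this dataset with the `datasets` library (the nested field access assumes the struct layout shown, e.g. `user.login`):

```python
from datasets import load_dataset

# Load the single "train" split (1,000 examples per the dataset_info above).
issues = load_dataset("bsankar/github-issues", split="train")

example = issues[0]

# Top-level features are plain columns.
print(example["title"])

# "user" is a struct, so each row holds a dict with keys such as "login".
print(example["user"]["login"])

# "comments" is a sequence of strings, one entry per issue comment.
print(len(example["comments"]))
```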
| open-llm-leaderboard/details_lloorree__kssht-dahj-70b | 2023-09-18T23:52:21.000Z | ["region:us"] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lloorree/kssht-dahj-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lloorree/kssht-dahj-70b](https://huggingface.co/lloorree/kssht-dahj-70b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__kssht-dahj-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T23:50:58.093131](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-dahj-70b/blob/main/results_2023-09-18T23-50-58.093131.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7033014017574061,\n\
\ \"acc_stderr\": 0.03081446175839962,\n \"acc_norm\": 0.7072547203046122,\n\
\ \"acc_norm_stderr\": 0.03078306684205309,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5891645864509103,\n\
\ \"mc2_stderr\": 0.015115214729699759\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6612627986348123,\n \"acc_stderr\": 0.013830568927974332,\n\
\ \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403515\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6867157936666003,\n\
\ \"acc_stderr\": 0.0046288092584835265,\n \"acc_norm\": 0.8730332603067118,\n\
\ \"acc_norm_stderr\": 0.003322552829608905\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677098,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677098\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.029896145682095455,\n\
\ \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.029896145682095455\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528437,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528437\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7282051282051282,\n \"acc_stderr\": 0.02255655101013236,\n \
\ \"acc_norm\": 0.7282051282051282,\n \"acc_norm_stderr\": 0.02255655101013236\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295894,\n \
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295894\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.908256880733945,\n \"acc_stderr\": 0.012376323409137103,\n \"\
acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137103\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8735632183908046,\n\
\ \"acc_stderr\": 0.011884488905895538,\n \"acc_norm\": 0.8735632183908046,\n\
\ \"acc_norm_stderr\": 0.011884488905895538\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6458100558659218,\n\
\ \"acc_stderr\": 0.015995644947299225,\n \"acc_norm\": 0.6458100558659218,\n\
\ \"acc_norm_stderr\": 0.015995644947299225\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225184,\n\
\ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225184\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5612777053455019,\n\
\ \"acc_stderr\": 0.012673969883493268,\n \"acc_norm\": 0.5612777053455019,\n\
\ \"acc_norm_stderr\": 0.012673969883493268\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7696078431372549,\n \"acc_stderr\": 0.01703522925803403,\n \
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.01703522925803403\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546195,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546195\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276915,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276915\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5891645864509103,\n\
\ \"mc2_stderr\": 0.015115214729699759\n }\n}\n```"
repo_url: https://huggingface.co/lloorree/kssht-dahj-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|arc:challenge|25_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hellaswag|10_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T23-50-58.093131.parquet'
- config_name: results
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- results_2023-09-18T23-50-58.093131.parquet
- split: latest
path:
- results_2023-09-18T23-50-58.093131.parquet
---
# Dataset Card for Evaluation run of lloorree/kssht-dahj-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/kssht-dahj-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/kssht-dahj-70b](https://huggingface.co/lloorree/kssht-dahj-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-dahj-70b",
"harness_truthfulqa_mc_0",
split="train")
```
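The aggregated scores described above live in the "results" configuration; as a minimal sketch (using the "latest" split listed in the configs), they can be loaded the same way:

```python
from datasets import load_dataset

# The "results" config stores the aggregated scores of the run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_lloorree__kssht-dahj-70b",
                       "results",
                       split="latest")
```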
## Latest results
These are the [latest results from run 2023-09-18T23:50:58.093131](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-dahj-70b/blob/main/results_2023-09-18T23-50-58.093131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7033014017574061,
"acc_stderr": 0.03081446175839962,
"acc_norm": 0.7072547203046122,
"acc_norm_stderr": 0.03078306684205309,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5891645864509103,
"mc2_stderr": 0.015115214729699759
},
"harness|arc:challenge|25": {
"acc": 0.6612627986348123,
"acc_stderr": 0.013830568927974332,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403515
},
"harness|hellaswag|10": {
"acc": 0.6867157936666003,
"acc_stderr": 0.0046288092584835265,
"acc_norm": 0.8730332603067118,
"acc_norm_stderr": 0.003322552829608905
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.030643607071677098,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.030643607071677098
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528437,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528437
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7282051282051282,
"acc_stderr": 0.02255655101013236,
"acc_norm": 0.7282051282051282,
"acc_norm_stderr": 0.02255655101013236
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295894,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295894
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137103,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137103
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065498,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065498
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8735632183908046,
"acc_stderr": 0.011884488905895538,
"acc_norm": 0.8735632183908046,
"acc_norm_stderr": 0.011884488905895538
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6458100558659218,
"acc_stderr": 0.015995644947299225,
"acc_norm": 0.6458100558659218,
"acc_norm_stderr": 0.015995644947299225
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225184,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225184
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5612777053455019,
"acc_stderr": 0.012673969883493268,
"acc_norm": 0.5612777053455019,
"acc_norm_stderr": 0.012673969883493268
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.02667925227010314,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.02667925227010314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.01703522925803403,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.01703522925803403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546195,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546195
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700637,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700637
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276915,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276915
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5891645864509103,
"mc2_stderr": 0.015115214729699759
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_lloorree__kssht-castor-70b | 2023-09-18T23:56:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lloorree/kssht-castor-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lloorree/kssht-castor-70b](https://huggingface.co/lloorree/kssht-castor-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__kssht-castor-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T23:54:47.734205](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-castor-70b/blob/main/results_2023-09-18T23-54-47.734205.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7025630433354887,\n\
\ \"acc_stderr\": 0.03070323641112233,\n \"acc_norm\": 0.7065431366848456,\n\
\ \"acc_norm_stderr\": 0.03067233267965294,\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5630669446354012,\n\
\ \"mc2_stderr\": 0.014865953800030475\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.01393680921215829,\n\
\ \"acc_norm\": 0.6953924914675768,\n \"acc_norm_stderr\": 0.01344952210993249\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6857199761003784,\n\
\ \"acc_stderr\": 0.004632797375289762,\n \"acc_norm\": 0.8753236407090221,\n\
\ \"acc_norm_stderr\": 0.003296764320821918\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.02916263159684399,\n\
\ \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.02916263159684399\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451207,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451207\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\"\
: 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172523,\n \"\
acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172523\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n \"\
acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983134,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983134\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880236,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880236\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7128205128205128,\n \"acc_stderr\": 0.022939925418530616,\n\
\ \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530616\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.027553614467863804,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.027553614467863804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"\
acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8684546615581098,\n\
\ \"acc_stderr\": 0.01208670521425043,\n \"acc_norm\": 0.8684546615581098,\n\
\ \"acc_norm_stderr\": 0.01208670521425043\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5463687150837989,\n\
\ \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.5463687150837989,\n\
\ \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888156,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385717,\n\
\ \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385717\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5541069100391134,\n\
\ \"acc_stderr\": 0.012695244711379783,\n \"acc_norm\": 0.5541069100391134,\n\
\ \"acc_norm_stderr\": 0.012695244711379783\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7679738562091504,\n \"acc_stderr\": 0.01707737337785693,\n \
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.01707737337785693\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007636,\n\
\ \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007636\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5630669446354012,\n\
\ \"mc2_stderr\": 0.014865953800030475\n }\n}\n```"
repo_url: https://huggingface.co/lloorree/kssht-castor-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|arc:challenge|25_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hellaswag|10_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T23-54-47.734205.parquet'
- config_name: results
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- results_2023-09-18T23-54-47.734205.parquet
- split: latest
path:
- results_2023-09-18T23-54-47.734205.parquet
---
# Dataset Card for Evaluation run of lloorree/kssht-castor-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/kssht-castor-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/kssht-castor-70b](https://huggingface.co/lloorree/kssht-castor-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-castor-70b",
"harness_truthfulqa_mc_0",
split="train")
```
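You can also point at the aggregated scores or pin a specific run. As a minimal sketch (using only the configuration and split names listed in this card's metadata), the "results" configuration holds the aggregated metrics, and each timestamped split freezes one evaluation run:
```python
from datasets import load_dataset

# Aggregated metrics for the run (the "results" configuration);
# the "latest" split tracks the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_lloorree__kssht-castor-70b",
    "results",
    split="latest",
)

# Per-sample details for one MMLU subtask, pinned to the
# timestamped split of this specific run.
details = load_dataset(
    "open-llm-leaderboard/details_lloorree__kssht-castor-70b",
    "harness_hendrycksTest_abstract_algebra_5",
    split="2023_09_18T23_54_47.734205",
)
```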
## Latest results
These are the [latest results from run 2023-09-18T23:54:47.734205](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-castor-70b/blob/main/results_2023-09-18T23-54-47.734205.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7025630433354887,
"acc_stderr": 0.03070323641112233,
"acc_norm": 0.7065431366848456,
"acc_norm_stderr": 0.03067233267965294,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5630669446354012,
"mc2_stderr": 0.014865953800030475
},
"harness|arc:challenge|25": {
"acc": 0.6501706484641638,
"acc_stderr": 0.01393680921215829,
"acc_norm": 0.6953924914675768,
"acc_norm_stderr": 0.01344952210993249
},
"harness|hellaswag|10": {
"acc": 0.6857199761003784,
"acc_stderr": 0.004632797375289762,
"acc_norm": 0.8753236407090221,
"acc_norm_stderr": 0.003296764320821918
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.02916263159684399,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.02916263159684399
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172523,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172523
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983134,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983134
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880236,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880236
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607555,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607555
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884565,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.01208670521425043,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.01208670521425043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5463687150837989,
"acc_stderr": 0.016650437588269076,
"acc_norm": 0.5463687150837989,
"acc_norm_stderr": 0.016650437588269076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.020263764996385717,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.020263764996385717
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5541069100391134,
"acc_stderr": 0.012695244711379783,
"acc_norm": 0.5541069100391134,
"acc_norm_stderr": 0.012695244711379783
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.01707737337785693,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.01707737337785693
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007636,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007636
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5630669446354012,
"mc2_stderr": 0.014865953800030475
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
untilthend/kye | 2023-09-19T00:08:03.000Z | [
"license:openrail",
"region:us"
] | untilthend | null | null | null | 0 | 0 | ---
license: openrail
---
|
Kinuko4/Kinuko4 | 2023-09-19T00:10:56.000Z | [
"region:us"
] | Kinuko4 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_lloorree__kssht-euripedes-70b | 2023-09-19T00:14:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lloorree/kssht-euripedes-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lloorree/kssht-euripedes-70b](https://huggingface.co/lloorree/kssht-euripedes-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__kssht-euripedes-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-19T00:12:39.048571](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-euripedes-70b/blob/main/results_2023-09-19T00-12-39.048571.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7032771782081723,\n\
\ \"acc_stderr\": 0.030834102504125972,\n \"acc_norm\": 0.70714084898032,\n\
\ \"acc_norm_stderr\": 0.030804015376568177,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5551008582453495,\n\
\ \"mc2_stderr\": 0.014893190834168417\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497723,\n\
\ \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716413\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6872137024497113,\n\
\ \"acc_stderr\": 0.004626805906522211,\n \"acc_norm\": 0.8759211312487553,\n\
\ \"acc_norm_stderr\": 0.0032899775233939097\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.02559185776138218,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02559185776138218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8129032258064516,\n \"acc_stderr\": 0.02218571009225225,\n \"\
acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.02218571009225225\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n \"\
acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7230769230769231,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.7230769230769231,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361276,\n\
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8972477064220183,\n \"acc_stderr\": 0.013018246509173768,\n \"\
acc_norm\": 0.8972477064220183,\n \"acc_norm_stderr\": 0.013018246509173768\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868834,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243631001,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243631001\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n\
\ \"acc_stderr\": 0.012331009307795656,\n \"acc_norm\": 0.8620689655172413,\n\
\ \"acc_norm_stderr\": 0.012331009307795656\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7947976878612717,\n \"acc_stderr\": 0.021742519835276274,\n\
\ \"acc_norm\": 0.7947976878612717,\n \"acc_norm_stderr\": 0.021742519835276274\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5575418994413408,\n\
\ \"acc_stderr\": 0.016611393687268574,\n \"acc_norm\": 0.5575418994413408,\n\
\ \"acc_norm_stderr\": 0.016611393687268574\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.02042395535477803,\n\
\ \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.02042395535477803\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5531914893617021,\n \"acc_stderr\": 0.02965823509766691,\n \
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.02965823509766691\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5580182529335072,\n\
\ \"acc_stderr\": 0.012683972513598827,\n \"acc_norm\": 0.5580182529335072,\n\
\ \"acc_norm_stderr\": 0.012683972513598827\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887657,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887657\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.761437908496732,\n \"acc_stderr\": 0.01724238582877962,\n \
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.01724238582877962\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546188,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546188\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5551008582453495,\n\
\ \"mc2_stderr\": 0.014893190834168417\n }\n}\n```"
repo_url: https://huggingface.co/lloorree/kssht-euripedes-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|arc:challenge|25_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hellaswag|10_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T00-12-39.048571.parquet'
- config_name: results
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- results_2023-09-19T00-12-39.048571.parquet
- split: latest
path:
- results_2023-09-19T00-12-39.048571.parquet
---
# Dataset Card for Evaluation run of lloorree/kssht-euripedes-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/kssht-euripedes-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/kssht-euripedes-70b](https://huggingface.co/lloorree/kssht-euripedes-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
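# Per-task details: "train" points at the latest run; per-run splits are
# named by timestamp (e.g. "2023_09_19T00_12_39.048571").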
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-euripedes-70b",
"harness_truthfulqa_mc_0",
split="train")
```
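To pull the aggregated scores instead of a single task, the same call works with the `results` config and its `latest` split (a minimal sketch based on the configs declared in this card's metadata):

```python
from datasets import load_dataset

# A minimal sketch: the "results" config stores the aggregated scores of the
# run, and its "latest" split always points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_lloorree__kssht-euripedes-70b",
    "results",
    split="latest",
)
```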
## Latest results
These are the [latest results from run 2023-09-19T00:12:39.048571](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-euripedes-70b/blob/main/results_2023-09-19T00-12-39.048571.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7032771782081723,
"acc_stderr": 0.030834102504125972,
"acc_norm": 0.70714084898032,
"acc_norm_stderr": 0.030804015376568177,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5551008582453495,
"mc2_stderr": 0.014893190834168417
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.013855831287497723,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.013417519144716413
},
"harness|hellaswag|10": {
"acc": 0.6872137024497113,
"acc_stderr": 0.004626805906522211,
"acc_norm": 0.8759211312487553,
"acc_norm_stderr": 0.0032899775233939097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02559185776138218,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02559185776138218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223157,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223157
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7230769230769231,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.7230769230769231,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.027722065493361276,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.027722065493361276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8972477064220183,
"acc_stderr": 0.013018246509173768,
"acc_norm": 0.8972477064220183,
"acc_norm_stderr": 0.013018246509173768
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868834,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631001,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631001
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795656,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795656
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7947976878612717,
"acc_stderr": 0.021742519835276274,
"acc_norm": 0.7947976878612717,
"acc_norm_stderr": 0.021742519835276274
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5575418994413408,
"acc_stderr": 0.016611393687268574,
"acc_norm": 0.5575418994413408,
"acc_norm_stderr": 0.016611393687268574
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.02042395535477803,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.02042395535477803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5580182529335072,
"acc_stderr": 0.012683972513598827,
"acc_norm": 0.5580182529335072,
"acc_norm_stderr": 0.012683972513598827
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887657,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887657
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.01724238582877962,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.01724238582877962
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546188,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546188
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5551008582453495,
"mc2_stderr": 0.014893190834168417
}
}
```
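To work with the aggregated numbers above programmatically instead of copying them out of the JSON, one option is to load the "results" configuration mentioned in the summary; a minimal sketch, assuming a "latest" split as in the config listings of sibling evaluation datasets:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run.
results = load_dataset(
    "open-llm-leaderboard/details_lloorree__kssht-euripedes-70b",
    "results",
    split="latest",
)
# Typically a single row holding the aggregated metrics shown above.
print(results[0])
```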
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Sao10K__Euryale-L2-70B | 2023-09-19T00:31:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Euryale-L2-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Euryale-L2-70B](https://huggingface.co/Sao10K/Euryale-L2-70B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Euryale-L2-70B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-19T00:30:23.278534](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Euryale-L2-70B/blob/main/results_2023-09-19T00-30-23.278534.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6876811863365377,\n\
\ \"acc_stderr\": 0.031295794134185935,\n \"acc_norm\": 0.6915491918846361,\n\
\ \"acc_norm_stderr\": 0.03126665242528465,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5449359501718896,\n\
\ \"mc2_stderr\": 0.0149529759469292\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726092,\n\
\ \"acc_norm\": 0.689419795221843,\n \"acc_norm_stderr\": 0.013522292098053054\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6826329416450906,\n\
\ \"acc_stderr\": 0.004645003662067883,\n \"acc_norm\": 0.8707428799044015,\n\
\ \"acc_norm_stderr\": 0.003347986669565319\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337145,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337145\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.03141082197596239,\n\
\ \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.03141082197596239\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.02557625706125384,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.02557625706125384\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"\
acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078874,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078874\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687964,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687964\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277723,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277723\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8733944954128441,\n \"acc_stderr\": 0.014257128686165167,\n \"\
acc_norm\": 0.8733944954128441,\n \"acc_norm_stderr\": 0.014257128686165167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.02838039114709471,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.02838039114709471\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494732,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494732\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884865,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884865\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8441890166028098,\n\
\ \"acc_stderr\": 0.012969269247762578,\n \"acc_norm\": 0.8441890166028098,\n\
\ \"acc_norm_stderr\": 0.012969269247762578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.022497230190967554,\n\
\ \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.022497230190967554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5050279329608939,\n\
\ \"acc_stderr\": 0.016721656037538415,\n \"acc_norm\": 0.5050279329608939,\n\
\ \"acc_norm_stderr\": 0.016721656037538415\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7993827160493827,\n \"acc_stderr\": 0.022282313949774882,\n\
\ \"acc_norm\": 0.7993827160493827,\n \"acc_norm_stderr\": 0.022282313949774882\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5531914893617021,\n \"acc_stderr\": 0.02965823509766691,\n \
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.02965823509766691\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.530638852672751,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.530638852672751,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7205882352941176,\n \"acc_stderr\": 0.01815287105153881,\n \
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.01815287105153881\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.7636363636363637,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.02688214492230774,\n\
\ \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.02688214492230774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5449359501718896,\n\
\ \"mc2_stderr\": 0.0149529759469292\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Euryale-L2-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|arc:challenge|25_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hellaswag|10_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T00-30-23.278534.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T00-30-23.278534.parquet'
- config_name: results
data_files:
- split: 2023_09_19T00_30_23.278534
path:
- results_2023-09-19T00-30-23.278534.parquet
- split: latest
path:
- results_2023-09-19T00-30-23.278534.parquet
---
# Dataset Card for Evaluation run of Sao10K/Euryale-L2-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Euryale-L2-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Euryale-L2-70B](https://huggingface.co/Sao10K/Euryale-L2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Euryale-L2-70B",
"harness_truthfulqa_mc_0",
split="train")
```
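Besides "train" and "latest", each configuration also exposes a split named after the run timestamp (with punctuation normalized, as shown in the config listing above); a hedged sketch of loading that run-specific split explicitly:
```python
from datasets import load_dataset

# The split name encodes the run timestamp; it matches the
# "2023_09_19T00_30_23.278534" entries in this card's config listing.
run_data = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Euryale-L2-70B",
    "harness_truthfulqa_mc_0",
    split="2023_09_19T00_30_23.278534",
)
print(run_data)
```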
## Latest results
These are the [latest results from run 2023-09-19T00:30:23.278534](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Euryale-L2-70B/blob/main/results_2023-09-19T00-30-23.278534.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6876811863365377,
"acc_stderr": 0.031295794134185935,
"acc_norm": 0.6915491918846361,
"acc_norm_stderr": 0.03126665242528465,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5449359501718896,
"mc2_stderr": 0.0149529759469292
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726092,
"acc_norm": 0.689419795221843,
"acc_norm_stderr": 0.013522292098053054
},
"harness|hellaswag|10": {
"acc": 0.6826329416450906,
"acc_stderr": 0.004645003662067883,
"acc_norm": 0.8707428799044015,
"acc_norm_stderr": 0.003347986669565319
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337145,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337145
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.03141082197596239,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.03141082197596239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.02557625706125384,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.02557625706125384
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078874,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078874
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687964,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687964
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8733944954128441,
"acc_stderr": 0.014257128686165167,
"acc_norm": 0.8733944954128441,
"acc_norm_stderr": 0.014257128686165167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.02838039114709471,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.02838039114709471
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494732,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494732
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884865,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884865
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8441890166028098,
"acc_stderr": 0.012969269247762578,
"acc_norm": 0.8441890166028098,
"acc_norm_stderr": 0.012969269247762578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.022497230190967554,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.022497230190967554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5050279329608939,
"acc_stderr": 0.016721656037538415,
"acc_norm": 0.5050279329608939,
"acc_norm_stderr": 0.016721656037538415
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7993827160493827,
"acc_stderr": 0.022282313949774882,
"acc_norm": 0.7993827160493827,
"acc_norm_stderr": 0.022282313949774882
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.530638852672751,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.530638852672751,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.01815287105153881,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.01815287105153881
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5449359501718896,
"mc2_stderr": 0.0149529759469292
}
}
```
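Because successive runs may not cover the same tasks, it can help to enumerate which configurations this repository actually exposes. A small sketch using the `datasets` helper (the configuration names match those in this card's metadata):
```python
from datasets import get_dataset_config_names

# Lists the per-task configurations (e.g. harness_arc_challenge_25,
# harness_hellaswag_10, ...) plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Sao10K__Euryale-L2-70B"
)
print(len(configs), configs[:5])
```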
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yicozy/dataset_pfs_hr_by_subgroup | 2023-09-19T01:04:47.000Z | [
"region:us"
] | yicozy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: response
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 6991314
num_examples: 8668
download_size: 0
dataset_size: 6991314
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dataset_pfs_hr_by_subgroup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kalebguillot/CropNet | 2023-09-19T01:57:37.000Z | [
"region:us"
] | kalebguillot | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1 | 2023-09-19T01:13:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ICBU-NPU/FashionGPT-70B-V1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ICBU-NPU/FashionGPT-70B-V1](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-19T01:12:17.792946](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1/blob/main/results_2023-09-19T01-12-17.792946.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7059262484193939,\n\
\ \"acc_stderr\": 0.030707005871601658,\n \"acc_norm\": 0.7099061113487667,\n\
\ \"acc_norm_stderr\": 0.0306756964963288,\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104003,\n \"mc2\": 0.639247711774609,\n\
\ \"mc2_stderr\": 0.014721455246819812\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6672354948805461,\n \"acc_stderr\": 0.013769863046192302,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6819358693487353,\n\
\ \"acc_stderr\": 0.004647727222445383,\n \"acc_norm\": 0.8732324238199561,\n\
\ \"acc_norm_stderr\": 0.0033203245481454053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.029674167520101456,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.029674167520101456\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.035995863012470763,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.035995863012470763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.022037217340267826,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.022037217340267826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"\
acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.027025433498882385,\n\
\ \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.027025433498882385\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247444,\n \"\
acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247444\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"\
acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6203703703703703,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990915,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990915\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8773946360153256,\n\
\ \"acc_stderr\": 0.011728672144131563,\n \"acc_norm\": 0.8773946360153256,\n\
\ \"acc_norm_stderr\": 0.011728672144131563\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7861271676300579,\n \"acc_stderr\": 0.02207570925175718,\n\
\ \"acc_norm\": 0.7861271676300579,\n \"acc_norm_stderr\": 0.02207570925175718\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046112,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046112\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n\
\ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n\
\ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.020581466138257124,\n\
\ \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.020581466138257124\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5495436766623207,\n\
\ \"acc_stderr\": 0.012707390438502348,\n \"acc_norm\": 0.5495436766623207,\n\
\ \"acc_norm_stderr\": 0.012707390438502348\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.7565359477124183,\n \"acc_stderr\": 0.01736247376214661,\n \"\
acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.01736247376214661\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.02500025603954619,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.02500025603954619\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4749082007343941,\n\
\ \"mc1_stderr\": 0.017481446804104003,\n \"mc2\": 0.639247711774609,\n\
\ \"mc2_stderr\": 0.014721455246819812\n }\n}\n```"
repo_url: https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|arc:challenge|25_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hellaswag|10_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T01-12-17.792946.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T01-12-17.792946.parquet'
- config_name: results
data_files:
- split: 2023_09_19T01_12_17.792946
path:
- results_2023-09-19T01-12-17.792946.parquet
- split: latest
path:
- results_2023-09-19T01-12-17.792946.parquet
---
# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ICBU-NPU/FashionGPT-70B-V1](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1",
"harness_truthfulqa_mc_0",
split="train")
```
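As a quick sanity check, the aggregated metrics can be loaded the same way — a minimal sketch, assuming only the `datasets` library and the configuration and split names listed in the YAML above:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1",
	"results",
	split="latest")
print(results[0])
```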
## Latest results
These are the [latest results from run 2023-09-19T01:12:17.792946](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1/blob/main/results_2023-09-19T01-12-17.792946.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7059262484193939,
"acc_stderr": 0.030707005871601658,
"acc_norm": 0.7099061113487667,
"acc_norm_stderr": 0.0306756964963288,
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104003,
"mc2": 0.639247711774609,
"mc2_stderr": 0.014721455246819812
},
"harness|arc:challenge|25": {
"acc": 0.6672354948805461,
"acc_stderr": 0.013769863046192302,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.6819358693487353,
"acc_stderr": 0.004647727222445383,
"acc_norm": 0.8732324238199561,
"acc_norm_stderr": 0.0033203245481454053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.029674167520101456,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.029674167520101456
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.035995863012470763,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.035995863012470763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267826,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284357,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.027025433498882385,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.027025433498882385
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247444,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247444
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884565,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990915,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990915
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8773946360153256,
"acc_stderr": 0.011728672144131563,
"acc_norm": 0.8773946360153256,
"acc_norm_stderr": 0.011728672144131563
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7861271676300579,
"acc_stderr": 0.02207570925175718,
"acc_norm": 0.7861271676300579,
"acc_norm_stderr": 0.02207570925175718
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.6,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046112,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046112
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.020581466138257124,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.020581466138257124
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5495436766623207,
"acc_stderr": 0.012707390438502348,
"acc_norm": 0.5495436766623207,
"acc_norm_stderr": 0.012707390438502348
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.01736247376214661,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.01736247376214661
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.02500025603954619,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.02500025603954619
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4749082007343941,
"mc1_stderr": 0.017481446804104003,
"mc2": 0.639247711774609,
"mc2_stderr": 0.014721455246819812
}
}
```
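The per-task entries above can also be aggregated directly. Below is a minimal sketch that recomputes the MMLU average from the `hendrycksTest` tasks, assuming `results.json` is a hypothetical local copy of the dictionary shown above:
```python
import json

# Hypothetical local copy of the results dictionary printed above.
with open("results.json") as f:
    results = json.load(f)

# Average accuracy over the 57 MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(sum(mmlu) / len(mmlu))
```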
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Cerrisete07/Raimundo | 2023-10-07T23:37:47.000Z | [
"license:other",
"region:us"
] | Cerrisete07 | null | null | null | 0 | 0 | ---
license: other
---
|
Jamal914/Xtt | 2023-09-19T01:48:31.000Z | [
"region:us"
] | Jamal914 | null | null | null | 0 | 0 | Entry not found |
zhangbo2008/Shin_Onimusha_Dawn_of_Dreams | 2023-09-19T02:45:18.000Z | [
"region:us"
] | zhangbo2008 | null | null | null | 0 | 0 | Entry not found |
chenqile09/alpaca-2-13B-chinese-couplet-val-4k-predictions | 2023-09-19T03:25:35.000Z | [
"region:us"
] | chenqile09 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: data
path: data/data-*
dataset_info:
features:
- name: input
dtype: string
- name: prediction
dtype: string
- name: reference
dtype: string
splits:
- name: data
num_bytes: 386019
num_examples: 4000
download_size: 341079
dataset_size: 386019
---
# Dataset Card for "alpaca-2-13B-chinese-couplet-val-4k-predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chunpingvi/temp_env | 2023-09-19T07:19:50.000Z | [
"region:us"
] | chunpingvi | null | null | null | 0 | 0 | Entry not found |
chenqile09/alpaca-2-7B-chinese-couplet-val-4k-predictions | 2023-09-19T03:45:31.000Z | [
"region:us"
] | chenqile09 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: data
path: data/data-*
dataset_info:
features:
- name: input
dtype: string
- name: prediction
dtype: string
- name: reference
dtype: string
splits:
- name: data
num_bytes: 386155
num_examples: 4000
download_size: 342185
dataset_size: 386155
---
# Dataset Card for "alpaca-2-7B-chinese-couplet-val-4k-predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gilderlan/etinhozaudio | 2023-09-19T03:49:04.000Z | [
"region:us"
] | Gilderlan | null | null | null | 0 | 0 | |
sscla/market-data | 2023-09-19T11:03:45.000Z | [
"task_categories:summarization",
"task_categories:text2text-generation",
"language:en",
"license:openrail",
"region:us"
] | sscla | null | null | null | 0 | 0 | ---
license: openrail
task_categories:
- summarization
- text2text-generation
language:
- en
pretty_name: u
--- |
linhqyy/data_aug_cua | 2023-09-19T05:34:48.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sentence
dtype: string
- name: intent
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 2162586
num_examples: 9941
- name: test
num_bytes: 241472
num_examples: 1105
download_size: 596926
dataset_size: 2404058
---
# Dataset Card for "data_aug_cua"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/jashinchandropkick | 2023-09-29T09:32:28.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Jashin-chan Dropkick
This is the image base of the bangumi Jashin-chan Dropkick. We detected 44 characters and 6043 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1710 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 41 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 44 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 29 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 41 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 32 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 20 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 14 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 210 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 292 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 76 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 30 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 398 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 14 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 14 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 62 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 84 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 20 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 21 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 27 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 11 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 17 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 143 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 10 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 8 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 42 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 217 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 11 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 555 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 534 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 11 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 32 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 24 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 211 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 160 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 22 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 23 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 5 | [Download](37/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 38 | 355 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 33 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 7 | [Download](40/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 41 | 9 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 5 | [Download](42/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| noise | 419 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
wangdayaya/beauty-avatar | 2023-09-19T06:21:46.000Z | [
"region:us"
] | wangdayaya | null | null | null | 0 | 0 | Entry not found |
sameer7/plag | 2023-09-19T06:41:24.000Z | [
"region:us"
] | sameer7 | null | null | null | 0 | 0 | Entry not found |
aravindasai/taco_menu | 2023-09-19T07:00:09.000Z | [
"region:us"
] | aravindasai | null | null | null | 0 | 0 | Entry not found |
xlagor/FIT5215 | 2023-09-19T07:14:29.000Z | [
"region:us"
] | xlagor | null | null | null | 0 | 0 | Entry not found |
spideyrim/output | 2023-09-19T07:47:05.000Z | [
"region:us"
] | spideyrim | null | null | null | 0 | 0 | Entry not found |
SoloTech/ia | 2023-09-19T07:48:22.000Z | [
"region:us"
] | SoloTech | null | null | null | 0 | 0 | |
CCRs/qqp-Quora_Question_Pairs | 2023-09-27T05:32:33.000Z | [
"region:us"
] | CCRs | null | null | null | 0 | 0 | Quora Question Pairs dataset (<a href="https://www.kaggle.com/competitions/quora-question-pairs/data?select=test.csv.zip">source</a>), translated to Kazakh (kz).
---
license: mit
---
|
Fuyuka29/Ishar | 2023-09-19T07:51:34.000Z | [
"region:us"
] | Fuyuka29 | null | null | null | 0 | 0 | Entry not found |
CyberNative/CryptoTorgashi | 2023-09-19T07:55:27.000Z | [
"region:us"
] | CyberNative | null | null | null | 0 | 0 | Entry not found |
Falah/chapter1_0_prompts | 2023-09-20T10:08:54.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2523
num_examples: 9
download_size: 3966
dataset_size: 2523
---
# Dataset Card for "chapter1_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter1_1_prompts | 2023-09-20T10:09:01.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 289
num_examples: 1
download_size: 2462
dataset_size: 289
---
# Dataset Card for "chapter1_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter1_2_prompts | 2023-09-19T08:22:52.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 151
num_examples: 1
download_size: 1499
dataset_size: 151
---
# Dataset Card for "chapter1_2_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KnutJaegersberg/facehugger | 2023-09-19T08:25:23.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | KnutJaegersberg | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
Just a downsampled version of the trilobite. I tested it with a 3B model; it has comparable or even superior benchmark results to the trilobite. |
nc33/clmdata | 2023-09-19T08:36:10.000Z | [
"region:us"
] | nc33 | null | null | null | 0 | 0 | Entry not found |
bongo2112/alikiba-SDxl-Comic-Style-Outputs | 2023-09-23T11:23:09.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
Falah/chapter2_0_prompts | 2023-09-20T10:15:42.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3019
num_examples: 10
download_size: 4623
dataset_size: 3019
---
# Dataset Card for "chapter2_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter2_1_prompts | 2023-09-20T10:15:50.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2873
num_examples: 14
download_size: 3760
dataset_size: 2873
---
# Dataset Card for "chapter2_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter3_0_prompts | 2023-09-20T10:19:41.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 4051
num_examples: 13
download_size: 5038
dataset_size: 4051
---
# Dataset Card for "chapter3_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter3_1_prompts | 2023-09-20T10:19:49.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3101
num_examples: 10
download_size: 4529
dataset_size: 3101
---
# Dataset Card for "chapter3_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter4_0_prompts | 2023-09-19T08:54:27.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2120
num_examples: 6
download_size: 3737
dataset_size: 2120
---
# Dataset Card for "chapter4_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter4_1_prompts | 2023-09-19T08:54:31.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3574
num_examples: 12
download_size: 4412
dataset_size: 3574
---
# Dataset Card for "chapter4_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter5_0_prompts | 2023-09-19T08:56:33.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2858
num_examples: 8
download_size: 4636
dataset_size: 2858
---
# Dataset Card for "chapter5_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter5_1_prompts | 2023-09-19T08:56:37.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3741
num_examples: 9
download_size: 5533
dataset_size: 3741
---
# Dataset Card for "chapter5_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter6_0_prompts | 2023-09-19T08:58:22.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3832
num_examples: 15
download_size: 4044
dataset_size: 3832
---
# Dataset Card for "chapter6_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter6_1_prompts | 2023-09-19T08:58:26.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2772
num_examples: 9
download_size: 3664
dataset_size: 2772
---
# Dataset Card for "chapter6_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter7_0_prompts | 2023-09-19T08:59:28.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2761
num_examples: 9
download_size: 3857
dataset_size: 2761
---
# Dataset Card for "chapter7_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter7_1_prompts | 2023-09-19T08:59:32.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3050
num_examples: 10
download_size: 3219
dataset_size: 3050
---
# Dataset Card for "chapter7_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter8_0_prompts | 2023-09-19T09:00:33.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3238
num_examples: 11
download_size: 4431
dataset_size: 3238
---
# Dataset Card for "chapter8_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter8_1_prompts | 2023-09-19T09:00:37.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2646
num_examples: 9
download_size: 3300
dataset_size: 2646
---
# Dataset Card for "chapter8_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter9_0_prompts | 2023-09-19T09:01:48.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2878
num_examples: 10
download_size: 4316
dataset_size: 2878
---
# Dataset Card for "chapter9_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter9_1_prompts | 2023-09-19T09:01:52.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3673
num_examples: 11
download_size: 4563
dataset_size: 3673
---
# Dataset Card for "chapter9_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liaochuweidavid/swahili_emo | 2023-09-19T09:23:02.000Z | [
"region:us"
] | liaochuweidavid | null | null | null | 0 | 0 | Entry not found |
eswardivi/Malayalam_MSA | 2023-09-19T09:37:24.000Z | [
"region:us"
] | eswardivi | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Negative
'1': Neutral
'2': Positive
splits:
- name: train
num_bytes: 91107974.0
num_examples: 70
download_size: 90971844
dataset_size: 91107974.0
---
# Dataset Card for "Malayalam_MSA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter10_0_prompts | 2023-09-19T16:05:31.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2212
num_examples: 6
download_size: 4047
dataset_size: 2212
---
# Dataset Card for "chapter10_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/chapter10_1_prompts | 2023-09-19T16:05:34.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2850
num_examples: 10
download_size: 4171
dataset_size: 2850
---
# Dataset Card for "chapter10_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Andyrasika/cow_dataset | 2023-09-19T09:57:01.000Z | [
"region:us"
] | Andyrasika | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int32
- name: image
dtype: image
splits:
- name: train
num_bytes: 145565588.0
num_examples: 51
download_size: 130979749
dataset_size: 145565588.0
---
# Dataset Card for "cow_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lunaluan/chatbox2_history | 2023-10-11T01:22:00.000Z | [
"region:us"
] | lunaluan | null | null | null | 0 | 0 | Entry not found |
mrchtr/cc-test | 2023-09-19T10:20:06.000Z | [
"region:us"
] | mrchtr | null | null | null | 0 | 0 | Entry not found |
Back-up/review | 2023-09-19T10:20:20.000Z | [
"region:us"
] | Back-up | null | null | null | 0 | 0 | Entry not found |
Jan150000/visual | 2023-09-19T10:30:00.000Z | [
"license:openrail",
"region:us"
] | Jan150000 | null | null | null | 0 | 0 | ---
license: openrail
---
|
jj1000/files | 2023-09-19T10:32:40.000Z | [
"region:us"
] | jj1000 | null | null | null | 0 | 0 | Entry not found |
Datasaur/mongabay-experiment | 2023-09-19T10:51:05.000Z | [
"region:us"
] | Datasaur | null | null | null | 0 | 0 | Entry not found |
dvrkdvys/ted_cruz | 2023-09-19T11:26:34.000Z | [
"region:us"
] | dvrkdvys | null | null | null | 0 | 0 | Entry not found |
HarshSinyal/AnomolyDetection_DL_PickleFile | 2023-09-19T11:02:03.000Z | [
"license:afl-3.0",
"region:us"
] | HarshSinyal | null | null | null | 0 | 0 | ---
license: afl-3.0
---
|
CyberHarem/xie_shen_chiyan_jashinchandropkick | 2023-09-19T11:12:33.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of 邪神ちゃん
This is the dataset of 邪神ちゃん, containing 299 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 299 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 684 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 299 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 299 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 299 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 299 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 299 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 684 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 684 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 684 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
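For completeness, one way to fetch any of the archives listed above is via the Hub client — a minimal sketch, assuming the `huggingface_hub` library is installed:
```python
from huggingface_hub import hf_hub_download

# Download one of the packaged archives listed in the table above.
path = hf_hub_download(repo_id="CyberHarem/xie_shen_chiyan_jashinchandropkick",
                       filename="dataset-raw.zip",
                       repo_type="dataset")
print(path)  # local cache path of the downloaded zip
```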
|
CyberHarem/hua_yuan_yurine_jashinchandropkick | 2023-09-19T11:38:22.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of 花園ゆりね
This is the dataset of 花園ゆりね, containing 276 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 276 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 638 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 276 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 276 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 276 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 276 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 276 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 638 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 638 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 638 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
lunaluan/chatbox3_history | 2023-10-11T01:22:16.000Z | [
"region:us"
] | lunaluan | null | null | null | 0 | 0 | Entry not found |
AdrianM0/bicerano_polymers | 2023-09-19T11:48:21.000Z | [
"license:mit",
"region:us"
] | AdrianM0 | null | null | null | 0 | 0 | ---
license: mit
---
|
dhenypatungka/X-RealistcV.1.1 | 2023-09-19T11:54:46.000Z | [
"region:us"
] | dhenypatungka | null | null | null | 0 | 0 | Entry not found |
CyberHarem/pekora_jashinchandropkick | 2023-09-19T11:57:19.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Pekora
This is the dataset of Pekora, containing 276 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 276 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 627 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 276 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 276 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 276 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 276 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 276 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 627 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 627 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 627 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
bongo2112/samia-SDxl-Comic-Style-Outputs | 2023-09-19T12:06:54.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/medeyusa_jashinchandropkick | 2023-09-19T12:14:57.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of メデューサ
This is the dataset of メデューサ, containing 268 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 268 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 591 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 268 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 268 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 268 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 268 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 268 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 591 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 591 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 591 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
vaibhavaHCL/English_fixlet_dataset_new | 2023-09-19T12:13:44.000Z | [
"region:us"
] | vaibhavaHCL | null | null | null | 0 | 0 | Entry not found |
yejeekang/legal_cn_instruction | 2023-09-19T12:16:24.000Z | [
"license:afl-3.0",
"region:us"
] | yejeekang | null | null | null | 0 | 0 | ---
license: afl-3.0
---
|
CyberHarem/minosu_jashinchandropkick | 2023-09-19T12:35:55.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ミノス
This is the dataset of ミノス, containing 283 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 283 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 673 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 283 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 283 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 283 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 283 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 283 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 673 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 673 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 673 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
bongo2112/mwijaku-SDxl-Comic-Style-Outputs | 2023-09-19T15:20:41.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
razagigz/dataset_storage | 2023-09-26T10:57:28.000Z | [
"region:us"
] | razagigz | null | null | null | 0 | 0 | Entry not found |
bongo2112/magufuli-SDxl-Comic-Style-Outputs | 2023-09-19T15:55:28.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/poporon_jashinchandropkick | 2023-09-19T13:02:32.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ぽぽろん
This is the dataset of ぽぽろん, containing 269 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 269 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 668 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 269 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 269 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 269 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 269 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 269 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 668 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 668 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 668 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/perusepone2shi_jashinchandropkick | 2023-09-19T13:15:05.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ペルセポネ2世
This is the dataset of ペルセポネ2世, containing 144 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 144 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 330 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 144 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 144 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 144 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 144 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 144 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 330 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 330 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 330 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/rieru_jashinchandropkick | 2023-09-19T13:29:16.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of リエール
This is the dataset of リエール, containing 178 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 178 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 363 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 178 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 178 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 178 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 178 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 178 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 363 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 363 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 363 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/pino_jashinchandropkick | 2023-09-19T13:51:07.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ぴの
This is the dataset of ぴの, containing 201 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 201 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 480 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 201 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 201 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 201 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 201 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 201 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 480 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 480 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 480 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
xlzoiolx/loud-coringa-v0111 | 2023-09-19T14:00:10.000Z | [
"license:cc",
"region:us"
] | xlzoiolx | null | null | null | 0 | 0 | ---
license: cc
---
|
DialogueCharacter/chinese_general_instruction_with_reward_score | 2023-09-19T14:07:52.000Z | [
"region:us"
] | DialogueCharacter | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: reward_score
dtype: float64
splits:
- name: train
num_bytes: 1634095908
num_examples: 1169201
download_size: 998968518
dataset_size: 1634095908
---
# Dataset Card for "chinese_general_instruction_with_reward_score"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DSSGxMunich/land_parcels | 2023-10-06T10:54:12.000Z | [
"license:mit",
"region:us"
] | DSSGxMunich | null | null | null | 1 | 0 | ---
license: mit
---
# Dataset Card for Land Parcels
## Dataset Description
**Homepage:** [DSSGx Munich](https://sites.google.com/view/dssgx-munich-2023/startseite) organization page.
**Repository:** [GitHub](https://github.com/DSSGxMunich/land-sealing-dataset-and-analysis).
### Dataset Summary
This dataset contains information about land parcels with building plans in the region of North Rhine-Westphalia. It was downloaded from the [NRW Geoportal](https://www.geoportal.nrw/?activetab=portal).
## Dataset Structure
### Data Fields
- **objectid**: Unique ID used for each land parcel.
- **planid**: ID from the Geoportal.
- **levelplan**: Spatial plan level, following the specified classification scheme; required according to the INSPIRE term “infra-local”.
- **name**: Name of building plan.
- **kommune**: Municipality.
- **gkz**: Municipal code of the municipality.
- **nr**: Number of the plan, occurring only once within the municipality. Can also combine letters and numbers. The numbering system should be defined by the municipality before the first plan entry.
- **besch**: Brief description of the building plan.
- **aend**: Use dependent on building name and reference to change.
- **aendert**: The ID of the original plan that is being changed.
- **stand**: According to code list legal status.
- **planart**: Type of plan/statute.
- **datum**: Date of building plan.
- **scanurl**: URL of the scanned building plan.
- **texturl**: URL to textual determinations, necessary if scanned plan does not contain them.
- **legendeurl**: URL to the plan symbols (legend) necessary, if plan symbols and determinations are available separately.
- **sonsturl**: If the URL under "Document" only links to the actual plan drawing (without supplementary documents), then a URL must be added under this field, under which the corresponding additional content (such as expert opinions) can be found.
- **verfahren**: Type of plan procedure.
- **plantyp**: The type of the plan.
- **aendnr**: Number of the modified plan.
- **begruendurl**: URL for justification. Necessary if justification and determinations are available separately.
- **umweltberurl**: URL to the environmental report. Necessary if report and determinations are available separately.
- **erklaerungurl**: URL to the recapitulative statement. Necessary if declaration and determinations are available separately.
- **shape_Length**: The length of the land parcel.
- **shape_Area**: The area of the land parcel.
- **regional_plan_id**: Unique ID of regional plan the land parcel is in.
- **regional_plan_name**: Name of the regional plan the land parcel is in.
- **ART**: Unique ID of regional plan the land parcel is in.
- **geometry**: Geographical location of the land parcel.
More information on these fields can be found in the NRW documentation [here](https://www.bauportal.nrw/system/files/media/document/file/21-12-08-bauleitplanung_benutzerhandbuch.pdf).
### Source Data
#### Initial Data Collection
The data was downloaded from the NRW Geoportal by selecting the building plan information and exporting the features as GeoJSON in QGIS.
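Since the parcels carry a `geometry` field and ship as GeoJSON, they can be explored with standard geospatial tooling — a minimal sketch, assuming a local export named `land_parcels.geojson` (a hypothetical filename) and that `geopandas` is installed:
```python
import geopandas as gpd

# Read the exported GeoJSON; each row is one land parcel with the fields above.
parcels = gpd.read_file("land_parcels.geojson")  # hypothetical local export

# Inspect a few of the documented fields.
print(parcels[["objectid", "kommune", "planart", "shape_Area"]].head())

# Total area covered by the parcels, using the precomputed shape_Area field.
print(parcels["shape_Area"].sum())
```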
|
abbottcolette/storage | 2023-09-19T14:39:32.000Z | [
"region:us"
] | abbottcolette | null | null | null | 0 | 0 | Entry not found |
justinpinkney/pokemon-blip-captions-wds | 2023-09-19T14:32:11.000Z | [
"region:us"
] | justinpinkney | null | null | null | 0 | 0 | Webdataset version of: [lambdalabs/pokemon-blip-captions](https://huggingface.co/datasets/lambdalabs/pokemon-blip-captions) |
basncy/test-set | 2023-09-19T14:38:45.000Z | [
"license:apache-2.0",
"region:us"
] | basncy | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
DucHaiten/NSFW | 2023-10-10T06:20:57.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | DucHaiten | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
vikp/clean_notebooks_filtered | 2023-09-19T16:02:46.000Z | [
"region:us"
] | vikp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: code
dtype: string
- name: kind
dtype: string
- name: parsed_code
dtype: string
- name: quality_prob
dtype: float64
- name: learning_prob
dtype: float64
splits:
- name: train
num_bytes: 3018948094.822456
num_examples: 195900
download_size: 1476349379
dataset_size: 3018948094.822456
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "clean_notebooks_filtered"
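A minimal usage sketch with the `datasets` library; the quality threshold below is illustrative, not part of the dataset definition:
```python
from datasets import load_dataset

# Load the single train split declared above.
ds = load_dataset("vikp/clean_notebooks_filtered", split="train")

# Keep only notebooks with a high predicted quality score.
high_quality = ds.filter(lambda ex: ex["quality_prob"] > 0.9)
print(len(high_quality), high_quality[0]["kind"])
```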
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
newdboy/cj_ko_words | 2023-09-19T15:58:43.000Z | [
"license:openrail",
"region:us"
] | newdboy | null | null | null | 0 | 0 | ---
license: openrail
---
|
linhqyy/data_aug_full | 2023-09-19T16:02:48.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sentence
dtype: string
- name: intent
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 1710412
num_examples: 8096
- name: test
num_bytes: 147335
num_examples: 704
download_size: 439882
dataset_size: 1857747
---
# Dataset Card for "data_aug_full"
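A minimal loading sketch with the `datasets` library, assuming the `entities` field is materialized as a list of dicts as declared above:
```python
from datasets import load_dataset

# Load the train and test splits declared above.
ds = load_dataset("linhqyy/data_aug_full")

example = ds["train"][0]
print(example["sentence"], "->", example["intent"])

# Each entity carries a type, the surface filler, and its labels.
for entity in example["entities"]:
    print(entity["type"], ":", entity["filler"])
```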
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yoshida_yuko_thedemongirlnextdoor | 2023-09-19T16:18:14.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yoshida Yuko
This is the dataset of Yoshida Yuko, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 729 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 729 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 729 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 729 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Gilderlan/dataset | 2023-09-19T16:29:13.000Z | [
"region:us"
] | Gilderlan | null | null | null | 0 | 0 | Entry not found |
mashiramaru/achylisflie | 2023-09-19T16:44:59.000Z | [
"license:other",
"region:us"
] | mashiramaru | null | null | null | 0 | 0 | ---
license: other
---
|
toninhodjj/khortex | 2023-09-19T16:53:43.000Z | [
"region:us"
] | toninhodjj | null | null | null | 0 | 0 | Entry not found |
CyberHarem/chiyoda_momo_thedemongirlnextdoor | 2023-09-19T17:07:50.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Chiyoda Momo
This is the dataset of Chiyoda Momo, containing 296 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 296 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 749 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 296 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 296 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 296 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 296 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 296 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 749 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 749 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 749 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|