datasetId | card |
|---|---|
Hansollll/korquad_v1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 65474804
num_examples: 48325
- name: test
num_bytes: 16380895
num_examples: 12082
download_size: 50475250
dataset_size: 81855699
---
# Dataset Card for "korquad_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lighteval/GPT3_unscramble | ---
dataset_info:
features:
- name: context
dtype: string
- name: completion
dtype: string
splits:
- name: mid_word_1_anagrams
num_bytes: 271516
num_examples: 10000
- name: mid_word_2_anagrams
num_bytes: 282654
num_examples: 10000
- name: cycle_letters_in_word
num_bytes: 282654
num_examples: 10000
- name: random_insertion_in_word
num_bytes: 353981
num_examples: 10000
- name: reversed_words
num_bytes: 282654
num_examples: 10000
download_size: 1131195
dataset_size: 1473459
---
# Dataset Card for "unscramble_GPT3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malucoelhaofc/ScottTenormanPortuguesV2 | ---
license: openrail
---
|
openaccess-ai-collective/e97ecd7e2a386b2a440226001b2f8f83 | Invalid username or password. |
andersonbcdefg/lmsys_utterances | ---
dataset_info:
features:
- name: user_utterance
dtype: string
splits:
- name: train
num_bytes: 344117006
num_examples: 1410658
download_size: 229656803
dataset_size: 344117006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cafbr/exchange | ---
license: openrail
task_categories:
- feature-extraction
language:
- en
tags:
- finance
pretty_name: open banking brazil exchange data
size_categories:
- n<1K
--- |
arthurmluz/wikilingua_data-wiki_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 21826073
num_examples: 8165
download_size: 12802622
dataset_size: 21826073
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "wikilingua_data-wiki_results"
rouge = {'rouge1': 0.34956276279808024, 'rouge2': 0.14816108637651773, 'rougeL': 0.27501599207153526, 'rougeLsum': 0.27501599207153526}
bert = {'precision': 0.7906100700329362, 'recall': 0.7631471133823419, 'f1': 0.7758486540348195}
moverscore = 0.6231903700836654 |
open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-10.7b-test | ---
pretty_name: Evaluation run of seyf1elislam/WestKunai-Hermes-10.7b-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [seyf1elislam/WestKunai-Hermes-10.7b-test](https://huggingface.co/seyf1elislam/WestKunai-Hermes-10.7b-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-10.7b-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T21:56:56.128733](https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-10.7b-test/blob/main/results_2024-03-21T21-56-56.128733.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6463508323015085,\n\
\ \"acc_stderr\": 0.03215718377225816,\n \"acc_norm\": 0.648726588285133,\n\
\ \"acc_norm_stderr\": 0.03280877605965653,\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.017501285074551835,\n \"mc2\": 0.6428048665212414,\n\
\ \"mc2_stderr\": 0.01571703778093368\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6552901023890785,\n \"acc_stderr\": 0.01388881628678211,\n\
\ \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.01362169611917331\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7034455287791277,\n\
\ \"acc_stderr\": 0.004558049018764654,\n \"acc_norm\": 0.8710416251742681,\n\
\ \"acc_norm_stderr\": 0.003344689038650326\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.029373646253234686,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.029373646253234686\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.037161774375660185,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.037161774375660185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.04858083574266344,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.04858083574266344\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n\
\ \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n\
\ \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"\
acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726853,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.03095663632856654,\n \
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.03095663632856654\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n\
\ \"acc_stderr\": 0.016129271025099843,\n \"acc_norm\": 0.8293577981651377,\n\
\ \"acc_norm_stderr\": 0.016129271025099843\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n\
\ \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.0329109957861577,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.0329109957861577\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n\
\ \"acc_stderr\": 0.015624236160792584,\n \"acc_norm\": 0.3217877094972067,\n\
\ \"acc_norm_stderr\": 0.015624236160792584\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959617,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959617\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4895697522816167,\n\
\ \"acc_stderr\": 0.012767457253930647,\n \"acc_norm\": 0.4895697522816167,\n\
\ \"acc_norm_stderr\": 0.012767457253930647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.0279715413701706,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.0279715413701706\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6895424836601307,\n \"acc_stderr\": 0.018718067052623223,\n \
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.018718067052623223\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174917,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174917\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.017501285074551835,\n \"mc2\": 0.6428048665212414,\n\
\ \"mc2_stderr\": 0.01571703778093368\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.01062696452997186\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5185746777862017,\n \
\ \"acc_stderr\": 0.013762977910317584\n }\n}\n```"
repo_url: https://huggingface.co/seyf1elislam/WestKunai-Hermes-10.7b-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|arc:challenge|25_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|gsm8k|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hellaswag|10_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-56-56.128733.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T21-56-56.128733.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- '**/details_harness|winogrande|5_2024-03-21T21-56-56.128733.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T21-56-56.128733.parquet'
- config_name: results
data_files:
- split: 2024_03_21T21_56_56.128733
path:
- results_2024-03-21T21-56-56.128733.parquet
- split: latest
path:
- results_2024-03-21T21-56-56.128733.parquet
---
# Dataset Card for Evaluation run of seyf1elislam/WestKunai-Hermes-10.7b-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [seyf1elislam/WestKunai-Hermes-10.7b-test](https://huggingface.co/seyf1elislam/WestKunai-Hermes-10.7b-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-10.7b-test",
"harness_winogrande_5",
split="train")
```
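The aggregated "results" configuration declared in the metadata above can be loaded the same way; here is a minimal sketch, assuming only the `datasets` library and the config/split names listed in this card:
```python
from datasets import load_dataset

# The "results" config has one split per run timestamp plus a "latest" split
# that always points to the most recent aggregated results file.
results = load_dataset(
    "open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-10.7b-test",
    "results",
    split="latest",
)

print(results.column_names)  # inspect the schema before relying on any particular field
print(results[0])            # the aggregated metrics mirror the JSON shown under "Latest results"
```
Replacing "results" with any of the `harness_*` config names above loads the per-task details instead.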
## Latest results
These are the [latest results from run 2024-03-21T21:56:56.128733](https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__WestKunai-Hermes-10.7b-test/blob/main/results_2024-03-21T21-56-56.128733.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6463508323015085,
"acc_stderr": 0.03215718377225816,
"acc_norm": 0.648726588285133,
"acc_norm_stderr": 0.03280877605965653,
"mc1": 0.4920440636474908,
"mc1_stderr": 0.017501285074551835,
"mc2": 0.6428048665212414,
"mc2_stderr": 0.01571703778093368
},
"harness|arc:challenge|25": {
"acc": 0.6552901023890785,
"acc_stderr": 0.01388881628678211,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.01362169611917331
},
"harness|hellaswag|10": {
"acc": 0.7034455287791277,
"acc_stderr": 0.004558049018764654,
"acc_norm": 0.8710416251742681,
"acc_norm_stderr": 0.003344689038650326
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.029373646253234686,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.029373646253234686
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.037161774375660185,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.037161774375660185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726853,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.03095663632856654,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.03095663632856654
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099843,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099843
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.0329109957861577,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.0329109957861577
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792584,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792584
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959617,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959617
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4895697522816167,
"acc_stderr": 0.012767457253930647,
"acc_norm": 0.4895697522816167,
"acc_norm_stderr": 0.012767457253930647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.0279715413701706,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.0279715413701706
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.018718067052623223,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.018718067052623223
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174917,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174917
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4920440636474908,
"mc1_stderr": 0.017501285074551835,
"mc2": 0.6428048665212414,
"mc2_stderr": 0.01571703778093368
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.01062696452997186
},
"harness|gsm8k|5": {
"acc": 0.5185746777862017,
"acc_stderr": 0.013762977910317584
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pythainlp/thai-cc-license | ---
dataset_info:
features:
- name: src
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 319024
num_examples: 6
download_size: 105664
dataset_size: 319024
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc0-1.0
task_categories:
- text-generation
language:
- th
tags:
- law
- license
size_categories:
- n<1K
---
# Thai CC License
The dataset collects all Thai Creative Commons license texts.
The dataset itself is released as CC0 (public domain).
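A minimal loading sketch (assuming the `pythainlp/thai-cc-license` repository id and the `train` split, `src`, and `text` columns declared in the metadata above):
```python
from datasets import load_dataset

# Load the single train split; each row carries the source ("src") and the
# full Thai license text ("text").
licenses = load_dataset("pythainlp/thai-cc-license", split="train")
for row in licenses:
    print(row["src"], len(row["text"]))
```
|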
one-sec-cv12/chunk_20 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 26697406032.125
num_examples: 277959
download_size: 23859952631
dataset_size: 26697406032.125
---
# Dataset Card for "chunk_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tofighi/bitcoin | ---
license: apache-2.0
---
|
dim/huggingartists_raw | ---
dataset_info:
features:
- name: text
dtype: string
- name: prompt
dtype: string
- name: dataset
dtype: string
splits:
- name: train
num_bytes: 121693362
num_examples: 69312
download_size: 56195290
dataset_size: 121693362
---
# Dataset Card for "huggingartists_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rishabhjain16/ct_test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: test
num_bytes: 1338289.0
num_examples: 10
download_size: 925365
dataset_size: 1338289.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
Caraaaaa/synthetic_image_text | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 609931.0
num_examples: 100
- name: validation
num_bytes: 306973.0
num_examples: 50
download_size: 884807
dataset_size: 916904.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
davanstrien/kto_maybe | Invalid username or password. |
dylanebert/3d-arena | ---
license: mit
tags:
- image-to-3d
--- |
mainlp/inconsistencies_forex | ---
license: cc-by-4.0
---
|
open-llm-leaderboard/details_macadeliccc__gemma-orchid-7b-dpo | ---
pretty_name: Evaluation run of macadeliccc/gemma-orchid-7b-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/gemma-orchid-7b-dpo](https://huggingface.co/macadeliccc/gemma-orchid-7b-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__gemma-orchid-7b-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T16:01:57.860566](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__gemma-orchid-7b-dpo/blob/main/results_2024-02-29T16-01-57.860566.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6144823622146915,\n\
\ \"acc_stderr\": 0.03285321994830655,\n \"acc_norm\": 0.6176876953771201,\n\
\ \"acc_norm_stderr\": 0.033514279111634626,\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5327250800320222,\n\
\ \"mc2_stderr\": 0.015159004173001832\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.591296928327645,\n \"acc_stderr\": 0.014365750345427005,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.01411797190114282\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6099382593108943,\n\
\ \"acc_stderr\": 0.004867670042866693,\n \"acc_norm\": 0.8095000995817566,\n\
\ \"acc_norm_stderr\": 0.0039189285565904754\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798335,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798335\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400496,\n \"\
acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400496\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462822,\n \"\
acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462822\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586804,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586804\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630643,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630643\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.025007329882461217,\n\
\ \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.025007329882461217\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552379,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552379\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.01653061740926685,\n \"acc_norm\"\
: 0.818348623853211,\n \"acc_norm_stderr\": 0.01653061740926685\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n\
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156214,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156214\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922531,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922531\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316556,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316556\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.01433352205921789,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.01433352205921789\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.02709865262130175,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.02709865262130175\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868052,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868052\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032193,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032193\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.029520095697687758,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.029520095697687758\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459595,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459595\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5327250800320222,\n\
\ \"mc2_stderr\": 0.015159004173001832\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5018953752843063,\n \
\ \"acc_stderr\": 0.013772385765569753\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/gemma-orchid-7b-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|arc:challenge|25_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|gsm8k|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hellaswag|10_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-01-57.860566.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T16-01-57.860566.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- '**/details_harness|winogrande|5_2024-02-29T16-01-57.860566.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T16-01-57.860566.parquet'
- config_name: results
data_files:
- split: 2024_02_29T16_01_57.860566
path:
- results_2024-02-29T16-01-57.860566.parquet
- split: latest
path:
- results_2024-02-29T16-01-57.860566.parquet
---
# Dataset Card for Evaluation run of macadeliccc/gemma-orchid-7b-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/gemma-orchid-7b-dpo](https://huggingface.co/macadeliccc/gemma-orchid-7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__gemma-orchid-7b-dpo",
"harness_winogrande_5",
split="train")
```
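The aggregated metrics can be loaded the same way through the `results` configuration; a minimal sketch, assuming the `latest` split defined in the configuration list above:
```python
from datasets import load_dataset

# Aggregated results of the most recent run; the "latest" split always points
# to the newest results file.
results = load_dataset("open-llm-leaderboard/details_macadeliccc__gemma-orchid-7b-dpo",
                       "results",
                       split="latest")
print(results[0])
```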
## Latest results
These are the [latest results from run 2024-02-29T16:01:57.860566](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__gemma-orchid-7b-dpo/blob/main/results_2024-02-29T16-01-57.860566.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6144823622146915,
"acc_stderr": 0.03285321994830655,
"acc_norm": 0.6176876953771201,
"acc_norm_stderr": 0.033514279111634626,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.5327250800320222,
"mc2_stderr": 0.015159004173001832
},
"harness|arc:challenge|25": {
"acc": 0.591296928327645,
"acc_stderr": 0.014365750345427005,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.01411797190114282
},
"harness|hellaswag|10": {
"acc": 0.6099382593108943,
"acc_stderr": 0.004867670042866693,
"acc_norm": 0.8095000995817566,
"acc_norm_stderr": 0.0039189285565904754
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798335,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798335
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400496,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400496
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462822,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586804,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586804
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630643,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630643
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.025007329882461217,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.025007329882461217
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552379,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552379
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.01653061740926685,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.01653061740926685
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156214,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156214
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922531,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922531
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316556,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316556
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.01433352205921789,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.01433352205921789
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.02709865262130175,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.02709865262130175
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868052,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868052
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032193,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032193
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.029520095697687758,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.029520095697687758
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459595,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459595
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.5327250800320222,
"mc2_stderr": 0.015159004173001832
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126735
},
"harness|gsm8k|5": {
"acc": 0.5018953752843063,
"acc_stderr": 0.013772385765569753
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
NeroUCH/online-health-chating | ---
license: pddl
task_categories:
- question-answering
- table-question-answering
language:
- zh
tags:
- healthcare
- chat
- llm
- medical
size_categories:
- 100K<n<1M
---
# Online Health Chating
This is the repository for the Online Health Chating project, which provides the dataset for the [chathealth](https://github.com/NeroHin/ChatHealth.git) project.
> Warning: This dataset is for academic research only; any commercial or clinical use is prohibited.
## Dataset
We used a crawler to collect the data from the following websites:
- [KingNet](http://www.kingnet.com.tw/)
| Item | Size |
| :----: | :----: |
| Row | 91,735 |
- [問 8 健康咨詢](https://tw.wen8health.com/)
| Item | Size |
| :----: | :----: |
| Row | 4,919 |
- [臺灣 E 院](https://sp1.hso.mohw.gov.tw/doctor/)
| Item | Size |
| :----: | :----: |
| Row | 153,251 |
- [家庭醫生](https://www.familydoctor.com.cn/)
| Item | Size |
| :----: | :----: |
| Row | 577,849 |
## LLM Dataset
We then concatenate the data and split it into train and dev sets with a 7:3 ratio (a minimal split sketch follows the file list below):
- train.json
- dev.json
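A minimal sketch of how such a 7:3 split could be produced, assuming the crawled question/answer records have already been concatenated into a single list (the file names and the use of scikit-learn are illustrative, not part of the original pipeline):

```python
import json
from sklearn.model_selection import train_test_split

# Hypothetical input: the concatenated question/answer records from all four sources.
with open("all_records.json", encoding="utf-8") as f:
    records = json.load(f)

# 7:3 train/dev split, as described above.
train, dev = train_test_split(records, test_size=0.3, random_state=42)

for name, split in [("train.json", train), ("dev.json", dev)]:
    with open(name, "w", encoding="utf-8") as f:
        json.dump(split, f, ensure_ascii=False, indent=2)
```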
| question | answer |
| :----: | :----: |
| e.g. 有什麼方法可以治療腎結石? | 有的,腎結石的治療方法有很多種,包括藥物治療、手術治療、醫療治療、中醫治療等。 |
```json
{
"question": "有什麼方法可以治療腎結石?",
"answer": "有的,腎結石的治療方法有很多種,包括藥物治療、手術治療、醫療治療、中醫治療等。"
}
``` |
pubmed | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
- text-classification
task_ids:
- language-modeling
- masked-language-modeling
- text-scoring
- topic-classification
paperswithcode_id: pubmed
pretty_name: PubMed
tags:
- citation-estimation
dataset_info:
- config_name: '2024'
features:
- name: MedlineCitation
struct:
- name: PMID
dtype: int32
- name: DateCompleted
struct:
- name: Year
dtype: int32
- name: Month
dtype: int32
- name: Day
dtype: int32
- name: NumberOfReferences
dtype: int32
- name: DateRevised
struct:
- name: Year
dtype: int32
- name: Month
dtype: int32
- name: Day
dtype: int32
- name: Article
struct:
- name: Abstract
struct:
- name: AbstractText
dtype: string
- name: ArticleTitle
dtype: string
- name: AuthorList
struct:
- name: Author
sequence:
- name: LastName
dtype: string
- name: ForeName
dtype: string
- name: Initials
dtype: string
- name: CollectiveName
dtype: string
- name: Language
dtype: string
- name: GrantList
struct:
- name: Grant
sequence:
- name: GrantID
dtype: string
- name: Agency
dtype: string
- name: Country
dtype: string
- name: PublicationTypeList
struct:
- name: PublicationType
sequence: string
- name: MedlineJournalInfo
struct:
- name: Country
dtype: string
- name: ChemicalList
struct:
- name: Chemical
sequence:
- name: RegistryNumber
dtype: string
- name: NameOfSubstance
dtype: string
- name: CitationSubset
dtype: string
- name: MeshHeadingList
struct:
- name: MeshHeading
sequence:
- name: DescriptorName
dtype: string
- name: QualifierName
dtype: string
- name: PubmedData
struct:
- name: ArticleIdList
sequence:
- name: ArticleId
sequence: string
- name: PublicationStatus
dtype: string
- name: History
struct:
- name: PubMedPubDate
sequence:
- name: Year
dtype: int32
- name: Month
dtype: int32
- name: Day
dtype: int32
- name: ReferenceList
sequence:
- name: Citation
dtype: string
- name: CitationId
dtype: int32
splits:
- name: train
num_bytes: 54723097181
num_examples: 36555430
download_size: 45202943276
dataset_size: 54723097181
---
# Dataset Card for PubMed
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nlm.nih.gov/databases/download/pubmed_medline.html
- **Documentation:** https://www.nlm.nih.gov/databases/download/pubmed_medline_documentation.html
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [National Center for Biotechnology Information](mailto:info@ncbi.nlm.nih.gov)
### Dataset Summary
PubMed comprises more than 36 million citations for biomedical literature from MEDLINE, life science journals, and online books. Citations may include links to full-text content from PubMed Central and publisher web sites.
NLM produces a baseline set of PubMed citation records in XML format for download on an annual basis. The annual baseline is released in December of each year.
- Last Updated December 15, 2023
Each day, NLM produces update files that include new, revised, and deleted citations.
Source: https://ftp.ncbi.nlm.nih.gov/pubmed/README.txt
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
- English
## Dataset Structure
Bear in mind that the data comes from XML with various tags that are hard to reflect
in a concise JSON format. Tags and lists do not map "naturally" between XML and this
format, which led this library to make some choices regarding the data. "Journal" info was dropped
altogether as it would have led to many fields being empty all the time.
The hierarchy is also a bit unnatural, but the choice was made to stay as close as
possible to the original data, since future releases may change the schema on NLM's side.
Author has been kept and contains either "ForeName", "LastName", "Initials", or "CollectiveName".
(All the fields are always present, but only some will be filled.)
### Data Instances
```json
{
    "MedlineCitation": {
        "PMID": 0,
        "DateCompleted": {"Year": 0, "Month": 0, "Day": 0},
        "NumberOfReferences": 0,
        "DateRevised": {"Year": 0, "Month": 0, "Day": 0},
        "Article": {
            "Abstract": {"AbstractText": "Some abstract (can be missing)"},
            "ArticleTitle": "Article title",
            "AuthorList": {"Author": [
                {"LastName": "Doe", "ForeName": "John", "Initials": "JD", "CollectiveName": ""},
                {"CollectiveName": "The Manhattan Project", "LastName": "", "ForeName": "", "Initials": ""}
            ]},
            "Language": "en",
            "GrantList": {
                "Grant": []
            },
            "PublicationTypeList": {"PublicationType": []}
        },
        "MedlineJournalInfo": {"Country": "France"},
        "ChemicalList": {"Chemical": [{
            "RegistryNumber": "XX",
            "NameOfSubstance": "Methanol"
        }]},
        "CitationSubset": "AIM",
        "MeshHeadingList": {
            "MeshHeading": []
        }
    },
    "PubmedData": {
        "ArticleIdList": {"ArticleId": "10.1002/bjs.1800650203"},
        "PublicationStatus": "ppublish",
        "History": {"PubMedPubDate": [{"Year": 0, "Month": 0, "Day": 0}]},
        "ReferenceList": [{"Citation": "Somejournal", "CitationId": 1}]
    }
}
```
### Data Fields
The main fields that will probably interest people are listed below (a loading sketch follows the list):
- "MedlineCitation" > "Article" > "AuthorList" > "Author"
- "MedlineCitation" > "Article" > "Abstract" > "AbstractText"
- "MedlineCitation" > "Article" > "Article Title"
- "MedlineCitation" > "ChemicalList" > "Chemical"
- "MedlineCitation" > "NumberOfReferences"
### Data Splits
There are no splits in this dataset. It is given as is.
## Dataset Creation
### Curation Rationale
The use of "Medline" in an element name does not mean the record represents a citation from a MEDLINE-selected journal. When the NLM DTDs and XML elements were first created, MEDLINE records were the only data exported. Now NLM exports citations other than MEDLINE records. To minimize unnecessary disruption to users of the data, NLM has retained the original element names (e.g., MedlineCitation, MedlineJournalInfo, MedlineTA).
Policies affecting data creation have evolved over the years. Some PubMed records are added or revised well after the cited article was first published. In these cases, on occasion an element that had not yet been created when the article was published may appear on the record. For example, the Abstract element was not created until 1975, but some records published before 1975 but added to PubMed after 1975 contain <Abstract>. It is also possible that an element may be treated differently from the way it would have been treated had the record been created or maintained near the time the article was published. For example, the number of <Author> occurrences can diverge from the policies stated in the NLM author indexing policy (https://pubmed.ncbi.nlm.nih.gov/help/#author-indexing-policy). Lastly, as of October 2016, the publisher of the original article has the capability to edit the PubMed record’s citation data, with the exception of MeSH data, using the PubMed Data Management system. PubMed record data for older citations, therefore, may contain data for elements that didn’t exist when the citation was created.
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[National Library of Medicine Terms and Conditions](https://www.nlm.nih.gov/databases/download/terms_and_conditions.html)
Downloading PubMed data from the National Library of Medicine FTP servers indicates your acceptance of the following Terms and Conditions. No charges, usage fees or royalties are paid to NLM for these data.
#### PubMed Specific Terms:
NLM freely provides PubMed data. Please note some abstracts may be protected by copyright.
#### General Terms and Conditions
Users of the data agree to:
- acknowledge NLM as the source of the data in a clear and conspicuous manner,
- NOT use the PubMed wordmark or the PubMed logo in association or in connection with user's or any other party's product or service.
- NOT adopt, use, or seek to register any mark or trade name confusingly similar to or suggestive of the PubMed wordmark or PubMed logo
- NOT to indicate or imply that NLM/NIH/HHS has endorsed its products/services/applications.
Users who republish or redistribute the data (services, products or raw data) agree to:
- maintain the most current version of all distributed data, or
- make known in a clear and conspicuous manner that the products/services/applications do not reflect the most current/accurate data available from NLM.
These data are produced with a reasonable standard of care, but NLM makes no warranties express or implied, including no warranty of merchantability or fitness for particular purpose, regarding the accuracy or completeness of the data. Users agree to hold NLM and the U.S. Government harmless from any liability resulting from errors in the data. NLM disclaims any liability for any consequences due to use, misuse, or interpretation of information contained or not contained in the data.
NLM does not provide legal advice regarding copyright, fair use, or other aspects of intellectual property rights. See the NLM Copyright page: https://www.nlm.nih.gov/web_policies.html#copyright
NLM reserves the right to change the type and format of its machine-readable data. NLM will take reasonable steps to inform users of any changes to the format of the data before the data are distributed via the announcement section or subscription to email and RSS updates.
The PubMed wordmark and the PubMed logo are registered trademarks of the U.S. Department of Health and Human Services (HHS). Unauthorized use of these marks is strictly prohibited.
### Citation Information
[Courtesy of the U.S. National Library of Medicine](https://www.nlm.nih.gov/databases/download/terms_and_conditions.html).
### Contributions
Thanks to [@Narsil](https://github.com/Narsil) for adding this dataset.
|
mesolitica/kesalahan-tatabahasa-choice | ---
license: mit
language:
- ms
---
# Kesalahan Tatabahasa Choice
Notebook at https://github.com/mesolitica/malaysian-dataset/tree/master/tatabahasa/qa-choice |
hemachandher/idefics_dataset | ---
dataset_info:
features:
- name: image
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2103
num_examples: 2
download_size: 8686
dataset_size: 2103
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nilq/babylm-100M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 569602489
num_examples: 10587551
- name: validation
num_bytes: 55093483
num_examples: 1026747
- name: test
num_bytes: 60175255
num_examples: 1054646
download_size: 429629738
dataset_size: 684871227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
language:
- en
pretty_name: BabyLM 100M
---
# BabyLM 100M
This curated dataset is originally from the [BabyLM Challenge](https://babylm.github.io/guidelines.html).
It consists of ~100M words of mixed-domain text drawn from the following sources:
- CHILDES (child-directed speech)
- Subtitles (speech)
- BNC (speech)
- TED talks (speech)
- children's books (simple written language) |
brandolorian/TinyQuestions | ---
license: mit
---
Dataset Name: TinyQuestions
Dataset Description:
The TinyQuestions dataset is a derivative of the original TinyStories dataset. It consists of short stories that have been modified to include special tokens marking questions and their corresponding answers. The dataset is designed for natural language processing tasks that involve question answering.
Derived from:
This dataset is derived from the TinyStories dataset available on Huggingface.
Format:
The dataset is available in CSV format, with each row representing a story. The stories are modified to include the following special tokens:
`<QUESTION-START>`: Marks the beginning of a question within the story.
`<QUESTION-END>`: Marks the end of a question within the story.
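A minimal sketch of how the marked questions could be extracted from a story using these tokens (the regular expression and function name are illustrative, not part of the dataset):

```python
import re

QUESTION_PATTERN = re.compile(r"<QUESTION-START>(.*?)<QUESTION-END>", re.DOTALL)

def extract_questions(story: str) -> list[str]:
    """Return every question marked by the special tokens in a story."""
    return [q.strip() for q in QUESTION_PATTERN.findall(story)]

story = (
    'One day, a little fish named Fin was swimming near the shore. '
    'He saw a big crab and wanted to be friends. "Hi, I am Fin.'
    '<QUESTION-START> Do you want to play?<QUESTION-END>"'
)
print(extract_questions(story))  # ['Do you want to play?']
```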
Examples:
One day, a little girl named Lily found a needle in her room. She knew it was difficult to play with it because it was sharp. Lily wanted to share the needle with her mom, so she could sew a button on her shirt. Lily went to her mom and said, "<QUESTION-START> Can you share it with me and sew my shirt?<QUESTION-END>"
One day, a little fish named Fin was swimming near the shore. He saw a big crab and wanted to be friends. "Hi, I am Fin.<QUESTION-START> Do you want to play?<QUESTION-END>"
Once upon a time, there was a clever little dog named Max. Max loved to run and play with his friends in the park. One day, Max was running very fast when he fell and hurt his knee. Max went to his friend, the wise old owl, and said, "Owl, my knee hurts.<QUESTION-START> What can I do?<QUESTION-END>" |
mirajbhandari/mistral_dataset_for_tuning | ---
license: mit
---
|
Aj901842/ADPSS | ---
license: openrail
---
|
kailashsp/ironman-armor | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1730684.0
num_examples: 149
download_size: 1724906
dataset_size: 1730684.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Violetmae14/autotrain-data-inanimate-insanity-text-to-animation-video | ---
license: bigscience-openrail-m
task_categories:
- token-classification
language:
- en
pretty_name: Keegan Kirby
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mwz/sindhi_alpaca_yc_filtered | ---
license: mit
dataset_info:
features:
- name: sindhi_instruction
dtype: string
- name: sindhi_input
dtype: string
- name: sindhi_output
dtype: string
splits:
- name: train
num_bytes: 42921578.7
num_examples: 26019
- name: test
num_bytes: 4769064.3
num_examples: 2891
download_size: 22162387
dataset_size: 47690643.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
mwinn99/GPL6887 | ---
license: odbl
tags:
- biology
size_categories:
- 10K<n<100K
---
Original, raw data can be found in Gene Expression Omnibus (GEO) https://www.ncbi.nlm.nih.gov/geo/
## Citation
Winnicki MJ, Brown CA, Porter HL, Giles CB, Wren JD, BioVDB: biological vector database for high-throughput gene expression meta-analysis, Frontiers in Artificial Intelligence 7 (2024)
https://www.frontiersin.org/articles/10.3389/frai.2024.1366273 |
DiegoRoberto10/diego12 | ---
license: openrail
---
|
MITCriticalData/cloud2cloudless_dataset_5_municipalities | ---
license: mit
---
# Creating Cloud-Cloudless Paired Dataset
To generate the Cloud-Cloudless Paired Dataset, we utilize an existing dataset that encompasses imagery from five municipalities in Colombia. Each municipality contains 165 images, acquired through the satellite_extractor API (based on SentinelHub) and spanning 12 different channels. Within each municipality, we have identified the optimal cloudless image and stored the corresponding names in a dictionary called `cloudless_groundtruths`.
The primary objective is to remove this cloudless image from the set of 165 images, leaving 164 images per municipality. Each of these 164 images is then paired with the previously identified cloudless image. This process therefore creates `164 * 2 * NUM_MUNICIPALITIES` images, i.e. 1640 images or 820 image pairs in total.
To facilitate this dataset creation, we have introduced the class `Cloud2CloudlesDataset`. This class replicates each corresponding ground truth for the 164 images in each municipality, storing every paired set in a newly designated folder named `DATASET`.
Originally, the images were formatted as `image_DD%%MM%%YY`. As part of the dataset creation process, we will rename these images to `image_DD%%MM%%YY_gt` for the ground truth image and `image_DD%%MM%%YY_cloud` for the image with clouds.
Upon initialization, the class requires the path to the source dataset, which contains raw images for each municipality organized in N folders, and the final path where the new dataset will be stored.
The implementation includes thorough testing to verify the number of images, ensuring that the final count aligns with the total number of images encountered in the source folder path. Additionally, one of the functions within this class ensures the existence of each folder in the specified destination path, guaranteeing a well-organized and comprehensive Cloud-Cloudless Paired Dataset.
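A minimal sketch of the pairing and renaming step described above, assuming each municipality folder holds its 165 images and that `cloudless_groundtruths` maps municipality names to the chosen cloudless file (the paths, the `.tif` extension, and the choice to reuse the cloudy image's date in the ground-truth file name are illustrative assumptions, not the exact behaviour of `Cloud2CloudlesDataset`):

```python
import shutil
from pathlib import Path

SOURCE = Path("municipalities")   # hypothetical source path, one folder per municipality
DEST = Path("DATASET")            # destination folder, as described above
cloudless_groundtruths = {"municipality_1": "image_01012020.tif"}  # illustrative mapping

for municipality, gt_name in cloudless_groundtruths.items():
    out_dir = DEST / municipality
    out_dir.mkdir(parents=True, exist_ok=True)
    gt_path = SOURCE / municipality / gt_name
    for img in sorted((SOURCE / municipality).glob("image_*.tif")):
        if img.name == gt_name:
            continue  # skip the cloudless image itself, leaving 164 cloudy images
        # Pair every cloudy image with a copy of the cloudless ground truth
        # (assumption: both files in a pair share the cloudy image's date stem).
        shutil.copy(img, out_dir / f"{img.stem}_cloud.tif")
        shutil.copy(gt_path, out_dir / f"{img.stem}_gt.tif")
```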
Github code [here](https://github.com/sebasmos/satellite.extractor/blob/main/notebooks/create_Cloud2CloudlesDataset.ipynb): |
FinchResearch/pallas_splitted_18c | ---
license: apache-2.0
task_categories:
- text-classification
- question-answering
- conversational
- text-generation
language:
- en
tags:
- language
- multipurpose
- nlp
--- |
one-sec-cv12/chunk_98 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 23567105664.0
num_examples: 245368
download_size: 21584566260
dataset_size: 23567105664.0
---
# Dataset Card for "chunk_98"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ammar-Azman/crawl-mufti_wilayah | ---
license: mit
---
👉 Dataset source: https://www.muftiwp.gov.my/ |
Falah/fox_2_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 3602
num_examples: 12
download_size: 4842
dataset_size: 3602
---
# Dataset Card for "fox_2_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r2_a64 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e5_r2_a64
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e5_r2_a64](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r2_a64)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r2_a64\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T00:41:13.717552](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r2_a64/blob/main/results_2024-02-10T00-41-13.717552.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5515960902251136,\n\
\ \"acc_stderr\": 0.03366098004700812,\n \"acc_norm\": 0.5572141751663529,\n\
\ \"acc_norm_stderr\": 0.03438109302311316,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826831,\n \"mc2\": 0.37409967945900374,\n\
\ \"mc2_stderr\": 0.013681044022204396\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946702\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6152160924118701,\n\
\ \"acc_stderr\": 0.0048554983433083876,\n \"acc_norm\": 0.8199561840270863,\n\
\ \"acc_norm_stderr\": 0.003834387002270879\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n \
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.0416345303130286,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.0416345303130286\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6741935483870968,\n \"acc_stderr\": 0.026662010578567107,\n \"\
acc_norm\": 0.6741935483870968,\n \"acc_norm_stderr\": 0.026662010578567107\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"\
acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.037563357751878974,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.037563357751878974\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098616,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098616\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.02951928261681723,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.02951928261681723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.02533900301010651,\n \
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.02533900301010651\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5336134453781513,\n \"acc_stderr\": 0.03240501447690071,\n \
\ \"acc_norm\": 0.5336134453781513,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.735632183908046,\n\
\ \"acc_stderr\": 0.015769984840690515,\n \"acc_norm\": 0.735632183908046,\n\
\ \"acc_norm_stderr\": 0.015769984840690515\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546665,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546665\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n\
\ \"acc_stderr\": 0.015268677317602288,\n \"acc_norm\": 0.29608938547486036,\n\
\ \"acc_norm_stderr\": 0.015268677317602288\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n\
\ \"acc_stderr\": 0.012573836633799011,\n \"acc_norm\": 0.41264667535853977,\n\
\ \"acc_norm_stderr\": 0.012573836633799011\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826831,\n \"mc2\": 0.37409967945900374,\n\
\ \"mc2_stderr\": 0.013681044022204396\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24184988627748294,\n \
\ \"acc_stderr\": 0.011794861371318703\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e5_r2_a64
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-41-13.717552.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-41-13.717552.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- '**/details_harness|winogrande|5_2024-02-10T00-41-13.717552.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T00-41-13.717552.parquet'
- config_name: results
data_files:
- split: 2024_02_10T00_41_13.717552
path:
- results_2024-02-10T00-41-13.717552.parquet
- split: latest
path:
- results_2024-02-10T00-41-13.717552.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e5_r2_a64
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e5_r2_a64](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r2_a64) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r2_a64",
"harness_winogrande_5",
split="train")
```
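To get just the aggregated metrics rather than the per-sample details, you can load the `results` configuration instead. A minimal sketch (the `results` config and its `latest` split come from the config list in the YAML header above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r2_a64",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores
```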
## Latest results
These are the [latest results from run 2024-02-10T00:41:13.717552](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r2_a64/blob/main/results_2024-02-10T00-41-13.717552.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5515960902251136,
"acc_stderr": 0.03366098004700812,
"acc_norm": 0.5572141751663529,
"acc_norm_stderr": 0.03438109302311316,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826831,
"mc2": 0.37409967945900374,
"mc2_stderr": 0.013681044022204396
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946702
},
"harness|hellaswag|10": {
"acc": 0.6152160924118701,
"acc_stderr": 0.0048554983433083876,
"acc_norm": 0.8199561840270863,
"acc_norm_stderr": 0.003834387002270879
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.0416345303130286,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.0416345303130286
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567107,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.037563357751878974,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.037563357751878974
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098616,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098616
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.02533900301010651,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.02533900301010651
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5336134453781513,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.5336134453781513,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890477,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890477
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.735632183908046,
"acc_stderr": 0.015769984840690515,
"acc_norm": 0.735632183908046,
"acc_norm_stderr": 0.015769984840690515
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546665,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546665
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.015268677317602288,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.015268677317602288
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41264667535853977,
"acc_stderr": 0.012573836633799011,
"acc_norm": 0.41264667535853977,
"acc_norm_stderr": 0.012573836633799011
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573026,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573026
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826831,
"mc2": 0.37409967945900374,
"mc2_stderr": 0.013681044022204396
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.01185004012485051
},
"harness|gsm8k|5": {
"acc": 0.24184988627748294,
"acc_stderr": 0.011794861371318703
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
alexandrainst/domsdatabasen | ---
dataset_info:
features:
- name: case_id
dtype: string
- name: Overskrift
dtype: string
- name: Afgørelsesstatus
dtype: string
- name: Faggruppe
dtype: string
- name: Ret
dtype: string
- name: Rettens sagsnummer
dtype: string
- name: Sagstype
dtype: string
- name: Instans
dtype: string
- name: Domsdatabasens sagsnummer
dtype: string
- name: Sagsemner
dtype: string
- name: Særlige retsskridt
dtype: string
- name: Sagsdeltagere
dtype: string
- name: Dørlukning
dtype: string
- name: Løftet ud af småsagsprocessen
dtype: string
- name: Anerkendelsespåstand
dtype: string
- name: Politiets journalnummer
dtype: string
- name: Påstandsbeløb
dtype: string
- name: Sagskomplekser
dtype: string
- name: text
dtype: string
- name: text_anonymized
dtype: string
- name: text_len
dtype: int64
- name: text_anon_len
dtype: int64
splits:
- name: train
num_bytes: 193593176
num_examples: 3917
download_size: 96435472
dataset_size: 193593176
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "domsdatabasen"
## Dataset Description
- **Point of Contact:** [Oliver Kinch](mailto:oliver.kinch@alexandra.dk)
- **Size of dataset:** 199 MB
### Dataset Summary
[Domsdatabasen](https://domsdatabasen.dk/) is a database where you can find and read selected judgments delivered by the Danish Courts.
Each judgment/case consists of tabular data and a case-descriptive PDF. This dataset collects all these cases, with each sample describing a specific judgment/case.
The PDFs are anonymized to protect sensitive information. Therefore, each sample includes two text versions:
- `text_anon` (with anonymization tags: \<anonym\>"Some sensitive text"\</anonym\>).
- `text` (without anonymization tags).
`text_anon` is read with [Easyocr](https://github.com/JaidedAI/EasyOCR).
`text` is read with [Easyocr](https://github.com/JaidedAI/EasyOCR) or [Tika-python](https://github.com/chrismattmann/tika-python)
depending on the PDF and the anonymization method used.
`text_anon` will be empty if no anonymization is detected in the PDF.
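If you want to work with the anonymized text but without the markup, the tags can be stripped (or the redacted spans collected) with a small helper. A minimal sketch, assuming the `<anonym>...</anonym>` format shown above:
```python
import re

# Matches one <anonym>...</anonym> span (assumed tag format from the card above)
ANON_TAG = re.compile(r"<anonym>(.*?)</anonym>", flags=re.DOTALL)

def strip_anonymization_tags(text_anon: str) -> str:
    """Drop the <anonym>...</anonym> markup, keeping whatever appears between the tags."""
    return ANON_TAG.sub(r"\1", text_anon)

def extract_anonymized_spans(text_anon: str) -> list:
    """Collect the spans that were wrapped in anonymization tags."""
    return ANON_TAG.findall(text_anon)
```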
### Languages
The dataset is available in Danish (`da`).
## Dataset Structure
An example from the dataset looks as follows.
```
{
"case_id": "id of case/judgment",
... The tabular string data ...,
"text": "pdf text",
"text_anon": "anonymized pdf text",
"text_len": <number of chars in text>,
"text_anon_len": <number of chars in anonymized text>
}
```
### Data Fields
- `case_id`: a `string` feature.
- `text`: a `string` feature.
- `text_anon`: a `string` feature.
- `text_len`: an `int` feature.
- `text_anon_len`: an `int` feature.
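A minimal loading sketch (the repository id and `train` split come from the YAML header above); filtering on `text_anon_len` is one way to keep only the samples where anonymization was detected:
```python
from datasets import load_dataset

ds = load_dataset("alexandrainst/domsdatabasen", split="train")

# Keep only the judgments where anonymization tags were detected
anonymized = ds.filter(lambda sample: sample["text_anon_len"] > 0)
print(len(ds), len(anonymized))
```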
### Dataset Statistics
#### Size of dataset
Since the PDF texts are provided in two versions, `text` and `text_anon`, the unique PDF text makes up roughly half of the ~199 MB total, i.e. about 100 MB.
#### Number of samples
- 3919
#### PDF Text Length Distribution
Statistics based on `text`.
- Minimum length: 192
- Maximum length: 2101736

## Potential Dataset Issues
See [open issues](https://github.com/oliverkinch/doms_databasen/issues).
## Dataset Creation
### Curation Rationale
There are not many large-scale law datasets in Danish.
### Source Data
The dataset has been scraped from [Domsdatabasen](https://domsdatabasen.dk/).
## Additional Information
### Dataset Curators
[Oliver Kinch](https://huggingface.co/oliverkinch) from [The Alexandra Institute](https://alexandra.dk/)
### Licensing Information
The dataset is licensed under the [CC0
license](https://creativecommons.org/share-your-work/public-domain/cc0/). |
LEAP/ClimSim_low-res_aqua-planet | ---
license: cc-by-4.0
---
Corresponding GitHub repo can be found here:
https://github.com/leap-stc/ClimSim
Read more: https://arxiv.org/abs/2306.08754. |
chemNLP/clinical-trials-v2 | ---
dataset_info:
features:
- name: filename
dtype: string
- name: xml
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 12800535347
num_examples: 456224
download_size: 3738991719
dataset_size: 12800535347
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xjlulu/ntu_adl_slot | ---
license: apache-2.0
task_categories:
- token-classification
language:
- en
--- |
open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B | ---
pretty_name: Evaluation run of TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B](https://huggingface.co/TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T11:27:51.194235](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B/blob/main/results_2024-02-14T11-27-51.194235.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23082125681978313,\n\
\ \"acc_stderr\": 0.02986949959492494,\n \"acc_norm\": 0.23087014880014953,\n\
\ \"acc_norm_stderr\": 0.030656588530011887,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931593,\n \"mc2\": 0.48654521547048707,\n\
\ \"mc2_stderr\": 0.01630952029889674\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2150170648464164,\n \"acc_stderr\": 0.012005717634133611,\n\
\ \"acc_norm\": 0.257679180887372,\n \"acc_norm_stderr\": 0.012780770562768407\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25652260505875324,\n\
\ \"acc_stderr\": 0.004358210689442257,\n \"acc_norm\": 0.2523401712806214,\n\
\ \"acc_norm_stderr\": 0.00433467695270386\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066656,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066656\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.026480357179895678,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.026480357179895678\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.030299574664788137,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.030299574664788137\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21164021164021163,\n \"acc_stderr\": 0.021037331505262883,\n \"\
acc_norm\": 0.21164021164021163,\n \"acc_norm_stderr\": 0.021037331505262883\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020514,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020514\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560493,\n\
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560493\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23243933588761176,\n\
\ \"acc_stderr\": 0.015104550008905706,\n \"acc_norm\": 0.23243933588761176,\n\
\ \"acc_norm_stderr\": 0.015104550008905706\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18006430868167203,\n\
\ \"acc_stderr\": 0.021823422857744953,\n \"acc_norm\": 0.18006430868167203,\n\
\ \"acc_norm_stderr\": 0.021823422857744953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931593,\n \"mc2\": 0.48654521547048707,\n\
\ \"mc2_stderr\": 0.01630952029889674\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616445\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|arc:challenge|25_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|gsm8k|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hellaswag|10_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-27-51.194235.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T11-27-51.194235.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- '**/details_harness|winogrande|5_2024-02-14T11-27-51.194235.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T11-27-51.194235.parquet'
- config_name: results
data_files:
- split: 2024_02_14T11_27_51.194235
path:
- results_2024-02-14T11-27-51.194235.parquet
- split: latest
path:
- results_2024-02-14T11-27-51.194235.parquet
---
# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B](https://huggingface.co/TW3PartnersLLM/TW3-v2-AlpacaSmaug-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B",
"harness_winogrande_5",
split="train")
```
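The aggregated metrics described above can be loaded in the same way from the "results" configuration; the snippet below is a minimal sketch that assumes the "latest" split alias listed in the configuration resolves to the most recent run.
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B",
    "results",
    split="latest",
)
```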
## Latest results
These are the [latest results from run 2024-02-14T11:27:51.194235](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3PartnersLLM__TW3-v2-AlpacaSmaug-72B/blob/main/results_2024-02-14T11-27-51.194235.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23082125681978313,
"acc_stderr": 0.02986949959492494,
"acc_norm": 0.23087014880014953,
"acc_norm_stderr": 0.030656588530011887,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931593,
"mc2": 0.48654521547048707,
"mc2_stderr": 0.01630952029889674
},
"harness|arc:challenge|25": {
"acc": 0.2150170648464164,
"acc_stderr": 0.012005717634133611,
"acc_norm": 0.257679180887372,
"acc_norm_stderr": 0.012780770562768407
},
"harness|hellaswag|10": {
"acc": 0.25652260505875324,
"acc_stderr": 0.004358210689442257,
"acc_norm": 0.2523401712806214,
"acc_norm_stderr": 0.00433467695270386
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066656,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066656
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.026480357179895678,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.026480357179895678
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788137,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788137
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21164021164021163,
"acc_stderr": 0.021037331505262883,
"acc_norm": 0.21164021164021163,
"acc_norm_stderr": 0.021037331505262883
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020514,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020514
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560493,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560493
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23243933588761176,
"acc_stderr": 0.015104550008905706,
"acc_norm": 0.23243933588761176,
"acc_norm_stderr": 0.015104550008905706
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18006430868167203,
"acc_stderr": 0.021823422857744953,
"acc_norm": 0.18006430868167203,
"acc_norm_stderr": 0.021823422857744953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931593,
"mc2": 0.48654521547048707,
"mc2_stderr": 0.01630952029889674
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616445
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AdapterOcean/med_alpaca_standardized_cluster_51_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 30760374
num_examples: 48003
download_size: 14729299
dataset_size: 30760374
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_51_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amphora/rewrite-se-quant | ---
dataset_info:
features:
- name: link
dtype: string
- name: query
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 51848272
num_examples: 21950
download_size: 26992504
dataset_size: 51848272
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
faisalq/AraPoems | ---
license: cc-by-nc-4.0
---
|
DBQ/Bottega.Veneta.Product.prices.United.States | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: United States - Bottega Veneta - Product-level price list
tags:
- webscraping
- ecommerce
- Bottega Veneta
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 1614857
num_examples: 4469
download_size: 452053
dataset_size: 1614857
---
# Bottega Veneta web scraped data
## About the website
Bottega Veneta is a prominent player in the **Luxury Fashion** industry in the **Americas**, particularly in the **United States**. This industry is defined by high-end clothing, accessories, leather goods, shoes and lifestyle items from distinctive brands. In the US, the Luxury Fashion market is shaped by trends like digitalization, personalized experiences, and sustainability. A significant amount of retail activity occurs on digital platforms. This dataset presents **Ecommerce product-list page (PLP)** data on **Bottega Veneta** in the United States, highlighting the brand's online retail presence in the industry. Ecommerce has become increasingly important in the Luxury Fashion sector as a direct-to-consumer avenue.
## Link to **dataset**
[United States - Bottega Veneta - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Bottega%20Veneta%20Product-prices%20United%20States/r/recZ249aYzkZQZLPx)
|
steven2521/squad_v2_rag_qa | ---
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
sequence: int64
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: question_embedding
sequence: float32
splits:
- name: train
num_bytes: 820791044
num_examples: 130319
- name: validation
num_bytes: 75187085
num_examples: 11873
download_size: 966385539
dataset_size: 895978129
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
EgilKarlsen/Spirit_BERT_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115650065.625
num_examples: 37500
- name: test
num_bytes: 38550020.0
num_examples: 12500
download_size: 211761700
dataset_size: 154200085.625
---
# Dataset Card for "Spirit_BERT_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NUHA-AI/ply-assets-public | ---
license: mit
---
|
guilgautier/dkt-images | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 62581.0
num_examples: 8
download_size: 59096
dataset_size: 62581.0
---
|
ParthGohil19/llama2-DS | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 688
num_examples: 172
download_size: 714
dataset_size: 688
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2-DS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/vr_train_free_23 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6228869606
num_examples: 10000
download_size: 1004570134
dataset_size: 6228869606
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/leto_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of leto/烈夏 (Arknights)
This is the dataset of leto/烈夏 (Arknights), containing 37 images and their tags.
The core tags of this character are `animal_ears, bear_ears, multicolored_hair, streaked_hair, brown_hair, hair_ornament, red_eyes, short_hair, black_hair, white_hair, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 37 | 51.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leto_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 37 | 44.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leto_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 83 | 88.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leto_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/leto_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
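The pre-packaged archives listed in the table above (e.g. `dataset-1200.zip`) can be fetched the same way; below is a minimal sketch, assuming you only need the IMG+TXT files on disk rather than a waifuc source:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 1200px IMG+TXT package (filename taken from the table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/leto_arknights',
    repo_type='dataset',
    filename='dataset-1200.zip',
)
# extract image/tag-text pairs to a local directory
output_dir = 'leto_1200'
os.makedirs(output_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(output_dir)
```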
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 37 |  |  |  |  |  | 1girl, long_sleeves, solo, sailor_collar, white_shirt, looking_at_viewer, pleated_skirt, smile, black_jacket, open_jacket, red_neckerchief, midriff, navel, scarf, miniskirt, open_mouth, simple_background, crop_top, serafuku, white_background, blue_skirt, thighhighs, black_gloves, fingerless_gloves, stomach |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | sailor_collar | white_shirt | looking_at_viewer | pleated_skirt | smile | black_jacket | open_jacket | red_neckerchief | midriff | navel | scarf | miniskirt | open_mouth | simple_background | crop_top | serafuku | white_background | blue_skirt | thighhighs | black_gloves | fingerless_gloves | stomach |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:----------------|:--------------|:--------------------|:----------------|:--------|:---------------|:--------------|:------------------|:----------|:--------|:--------|:------------|:-------------|:--------------------|:-----------|:-----------|:-------------------|:-------------|:-------------|:---------------|:--------------------|:----------|
| 0 | 37 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
duduvorpagel/flokis | ---
license: openrail
---
|
defector/autotrain-data-company | ---
language:
- en
---
# AutoTrain Dataset for project: company
## Dataset Description
This dataset has been automatically processed by AutoTrain for project company.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"tokens": [
"sahil",
"prasad",
"president",
"www",
"swimcentre",
"com",
"banik",
"baalkrishan",
"gandhi",
"com",
"no",
"satish",
"nagar",
"hisar"
],
"tags": [
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0
]
},
{
"tokens": [
"olivia",
"wilson",
"real",
"estate",
"agent",
"reallygreatsite",
"com",
"anywhere",
"st",
"any",
"city",
"st",
"www",
"reallygreatsite",
"com"
],
"tags": [
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0,
0
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"tags": "Sequence(feature=ClassLabel(num_classes=2, names=['0', '9'], id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 999651 |
| valid | 499630 |
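A minimal loading sketch is shown below; it assumes the dataset loads through the standard `datasets` API under this repository id and that the splits are named `train` and `valid` as in the table above:
```python
from datasets import load_dataset

# load the AutoTrain-processed dataset (repository id assumed from this card)
ds = load_dataset("defector/autotrain-data-company")

example = ds["train"][0]
# each example pairs a list of tokens with a list of integer tags
for token, tag in zip(example["tokens"], example["tags"]):
    print(f"{token}\t{tag}")
```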
|
ArtifactAI/arxiv_python_research_code | ---
dataset_info:
features:
- name: repo
dtype: string
- name: file
dtype: string
- name: code
dtype: string
- name: file_length
dtype: int64
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: extension_type
dtype: string
splits:
- name: train
num_bytes: 12984199778
num_examples: 1415924
download_size: 4073853616
dataset_size: 12984199778
license: bigcode-openrail-m
task_categories:
- text-generation
language:
- en
pretty_name: arxiv_python_research_code
size_categories:
- 1B<n<10B
---
# Dataset Card for "ArtifactAI/arxiv_python_research_code"
## Dataset Description
https://huggingface.co/datasets/ArtifactAI/arxiv_python_research_code
### Dataset Summary
ArtifactAI/arxiv_python_research_code contains over 4.13 GB of source code files from repositories referenced in ArXiv papers. It serves as a curated dataset for Code LLMs.
### How to use it
```python
from datasets import load_dataset
# full dataset (4.13GB of data)
ds = load_dataset("ArtifactAI/arxiv_python_research_code", split="train")
# dataset streaming (will only download the data as needed)
ds = load_dataset("ArtifactAI/arxiv_python_research_code", streaming=True, split="train")
for sample in ds:
    print(sample["code"])
```
## Dataset Structure
### Data Instances
Each data instance corresponds to one file. The content of the file is in the `code` feature, and other features (`repo`, `file`, etc.) provide some metadata.
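For example, a quick way to peek at one instance and its metadata fields (a minimal sketch using the streaming mode shown above):
```python
from datasets import load_dataset

# stream the dataset so only the first sample needs to be downloaded
ds = load_dataset("ArtifactAI/arxiv_python_research_code", streaming=True, split="train")

sample = next(iter(ds))
print(sample["repo"], sample["file"], sample["extension_type"])  # metadata fields
print(sample["code"][:200])                                      # start of the source file
```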
### Data Fields
- `repo` (string): code repository name.
- `file` (string): file path in the repository.
- `code` (string): code within the file.
- `file_length` (integer): number of characters in the file.
- `avg_line_length` (float): the average line-length of the file.
- `max_line_length` (integer): the maximum line-length of the file.
- `extension_type` (string): file extension.
### Data Splits
The dataset has no predefined splits; all data is loaded as the `train` split by default.
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
34,099 active GitHub repository names were extracted from [ArXiv](https://arxiv.org/) papers published from the archive's inception through July 21st, 2023, totaling 773 GB of compressed GitHub repositories.
These repositories were then filtered, and the code from each file with a '.py' extension was extracted, yielding 1.4 million files.
#### Who are the source language producers?
The source (code) language producers are users of GitHub who created unique repositories.
### Personal and Sensitive Information
The released dataset may contain sensitive information such as emails, IP addresses, and API/ssh keys that have previously been published to public repositories on GitHub.
## Additional Information
### Dataset Curators
Matthew Kenney, Artifact AI, matt@artifactai.com
### Citation Information
```
@misc{arxiv_python_research_code,
title={arxiv_python_research_code},
author={Matthew Kenney},
year={2023}
}
``` |
open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b | ---
pretty_name: Evaluation run of uukuguy/speechless-llama2-luban-orca-platypus-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-llama2-luban-orca-platypus-13b](https://huggingface.co/uukuguy/speechless-llama2-luban-orca-platypus-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T17:51:55.747438](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b/blob/main/results_2023-10-16T17-51-55.747438.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006921140939597316,\n\
\ \"em_stderr\": 0.0008490247804930292,\n \"f1\": 0.11193687080536992,\n\
\ \"f1_stderr\": 0.0020523308364626394,\n \"acc\": 0.4264965386587744,\n\
\ \"acc_stderr\": 0.009679849375871168\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.006921140939597316,\n \"em_stderr\": 0.0008490247804930292,\n\
\ \"f1\": 0.11193687080536992,\n \"f1_stderr\": 0.0020523308364626394\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08188021228203184,\n \
\ \"acc_stderr\": 0.007552338527716947\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025388\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-llama2-luban-orca-platypus-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|arc:challenge|25_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T17_51_55.747438
path:
- '**/details_harness|drop|3_2023-10-16T17-51-55.747438.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T17-51-55.747438.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T17_51_55.747438
path:
- '**/details_harness|gsm8k|5_2023-10-16T17-51-55.747438.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T17-51-55.747438.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hellaswag|10_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T05:54:43.169153.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T05:54:43.169153.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T05:54:43.169153.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T17_51_55.747438
path:
- '**/details_harness|winogrande|5_2023-10-16T17-51-55.747438.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T17-51-55.747438.parquet'
- config_name: results
data_files:
- split: 2023_09_01T05_54_43.169153
path:
- results_2023-09-01T05:54:43.169153.parquet
- split: 2023_10_16T17_51_55.747438
path:
- results_2023-10-16T17-51-55.747438.parquet
- split: latest
path:
- results_2023-10-16T17-51-55.747438.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-llama2-luban-orca-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-llama2-luban-orca-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-llama2-luban-orca-platypus-13b](https://huggingface.co/uukuguy/speechless-llama2-luban-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
# Load the per-example details for the 5-shot Winogrande eval; the "latest" split points to the most recent run
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b",
	"harness_winogrande_5",
	split="latest")
```
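The aggregated scores live in the "results" configuration; a minimal sketch of loading them, assuming the `results` configuration and `latest` split defined in the YAML above:
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run; timestamped splits hold earlier runs
results = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b",
    "results",
    split="latest")
```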
## Latest results
These are the [latest results from run 2023-10-16T17:51:55.747438](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b/blob/main/results_2023-10-16T17-51-55.747438.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006921140939597316,
"em_stderr": 0.0008490247804930292,
"f1": 0.11193687080536992,
"f1_stderr": 0.0020523308364626394,
"acc": 0.4264965386587744,
"acc_stderr": 0.009679849375871168
},
"harness|drop|3": {
"em": 0.006921140939597316,
"em_stderr": 0.0008490247804930292,
"f1": 0.11193687080536992,
"f1_stderr": 0.0020523308364626394
},
"harness|gsm8k|5": {
"acc": 0.08188021228203184,
"acc_stderr": 0.007552338527716947
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025388
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
KushT/reuters-21578-train-val-test | ---
license: apache-2.0
size_categories:
- 1K<n<10K
task_categories:
- text-classification
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 10816829
num_examples: 6988
- name: validation
num_bytes: 1178067
num_examples: 781
- name: test
num_bytes: 4513694
num_examples: 3019
download_size: 5088303
dataset_size: 16508590
language:
- en
---
Dataset from [Kaggle](https://www.kaggle.com/datasets/nltkdata/reuters/code)
The split is done on the training set using `iterative_train_test_split` from [scikit-multilearn](http://scikit.ml/index.html).
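For reference, a minimal sketch of such a stratified multi-label split (with toy placeholder data; the actual feature and label matrices used for this dataset are not reproduced here):
```python
import numpy as np
from skmultilearn.model_selection import iterative_train_test_split

rng = np.random.default_rng(0)
X = np.arange(100).reshape(-1, 1)        # placeholder "features", e.g. document indices
y = rng.integers(0, 2, size=(100, 5))    # multi-hot labels for 5 toy classes

# Iterative stratification keeps the label distribution similar across both splits
X_train, y_train, X_val, y_val = iterative_train_test_split(X, y, test_size=0.1)
```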
There are the following 90 labels:
'interest',
'groundnut-oil',
'potato',
'palmkernel',
'sun-meal',
'lei',
'cotton-oil',
'sunseed',
'sorghum',
'barley',
'dlr',
'groundnut',
'wpi',
'strategic-metal',
'livestock',
'l-cattle',
'lin-oil',
'gold',
'fuel',
'nzdlr',
'oat',
'soybean',
'hog',
'tin',
'lumber',
'bop',
'soy-oil',
'dfl',
'nkr',
'gas',
'carcass',
'silver',
'coffee',
'gnp',
'crude',
'rapeseed',
'alum',
'copper',
'housing',
'grain',
'cocoa',
'sun-oil',
'rice',
'jobs',
'rubber',
'jet',
'tea',
'retail',
'ship',
'corn',
'meal-feed',
'naphtha',
'sugar',
'rand',
'platinum',
'money-supply',
'yen',
'nickel',
'income',
'cpu',
'copra-cake',
'instal-debt',
'coconut-oil',
'cotton',
'rye',
'palm-oil',
'acq',
'wheat',
'propane',
'dmk',
'reserves',
'rape-oil',
'money-fx',
'heat',
'ipi',
'castor-oil',
'earn',
'iron-steel',
'palladium',
'coconut',
'veg-oil',
'nat-gas',
'pet-chem',
'lead',
'trade',
'cpi',
'oilseed',
'zinc',
'soy-meal',
'orange' |
NbAiLab/norwegian-alpaca | ---
license: cc-by-4.0
language:
- 'no'
- nb
tags:
- instruction-finetuning
pretty_name: NB Alpaca Norwegian Bokmål
task_categories:
- text-generation
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: instruction_en
dtype: string
- name: input_en
dtype: string
- name: output_en
dtype: string
splits:
- name: train
num_bytes: 38067492
num_examples: 51942
download_size: 24204487
dataset_size: 38067492
---
# NB Alpaca Norwegian Bokmål
This dataset is a translation to Norwegian Bokmål of [alpaca_data_cleaned.json](https://github.com/tloen/alpaca-lora/blob/main/alpaca_data_cleaned.json), a clean version of the [Alpaca dataset made at Stanford](https://huggingface.co/datasets/tatsu-lab/alpaca).
An [earlier version](https://huggingface.co/datasets/NbAiLab/norwegian-alpaca/tree/main/nllb) used [Facebook's NLLB 1.3B model](https://huggingface.co/facebook/nllb-200-1.3B), but the current version uses OpenAI's `gpt-3.5-turbo`, hence this dataset cannot be used to create models that compete in any way against OpenAI. |
carlosejimenez/wikipedia-20220301.en-block-size-1024 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: validation
num_bytes: 301864191
num_examples: 21817
- name: train
num_bytes: 60558566627
num_examples: 4368542
download_size: 20321590769
dataset_size: 60860430818
---
# Dataset Card for "wikipedia-20220301.en-block-size-1024"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DhimanBose/Bangla_Masked_Language_Model_dataset_preprocessed | ---
language:
- bn
size_categories:
- 1M<n<10M
task_categories:
- text-generation
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: word_ids
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 14852870908
num_examples: 5207879
download_size: 3451024663
dataset_size: 14852870908
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_qqp_demonstrative_no_number | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 70321
num_examples: 322
- name: test
num_bytes: 567920
num_examples: 2682
- name: train
num_bytes: 605496
num_examples: 2663
download_size: 706442
dataset_size: 1243737
---
# Dataset Card for "MULTI_VALUE_qqp_demonstrative_no_number"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vinnyyw/Afonsovoz | ---
license: openrail
---
|
godoyj/wikilingua | ---
language:
- pt
task_categories:
- summarization
--- |
aicyd/gov_report_sshort | ---
license: apache-2.0
---
|
NobreJooj/Vinni | ---
license: openrail
---
|
psroy/mini-platypus-scienceqa-one | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 669431
num_examples: 1000
download_size: 293121
dataset_size: 669431
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
datahrvoje/twitter_dataset_1713156674 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24816
num_examples: 56
download_size: 12593
dataset_size: 24816
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Q-b1t/CVE_NORMALIZED_DESCRIPTION_CVSS_MAPPING | ---
license: mit
---
|
carlosejimenez/qqp_corpus_trainval | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: text
dtype: string
splits:
- name: test
num_bytes: 52434356
num_examples: 390965
- name: train
num_bytes: 53724642
num_examples: 404276
- name: validation
num_bytes: 5370744
num_examples: 40430
download_size: 50205619
dataset_size: 111529742
---
# Dataset Card for "qqp_corpus_trainval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aryanmehta5902/doctest2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 30175354
num_examples: 1001
download_size: 8247741
dataset_size: 30175354
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dipteshkanojia/llama-2-qe-2023-enmr-da-sys-test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 657819
num_examples: 1086
download_size: 281499
dataset_size: 657819
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
- mr
---
# Dataset Card for "llama-2-qe-2023-enmr-da-sys-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DGurgurov/indonesian_sa | ---
license: mit
---
## Sentiment Analysis Data for the Indonesian Language
**Dataset Description:**
This dataset contains sentiment analysis data from Purwarianti et al. (2019).
**Data Structure:**
The data was used for the project on [injecting external commonsense knowledge into multilingual Large Language Models](https://github.com/d-gurgurov/Injecting-Commonsense-Knowledge-into-LLMs).
**Citation:**
```bibtex
@inproceedings{purwarianti2019improving,
title={Improving bi-lstm performance for indonesian sentiment analysis using paragraph vector},
author={Purwarianti, Ayu and Crisdayanti, Ida Ayu Putu Ari},
booktitle={2019 International Conference of Advanced Informatics: Concepts, Theory and Applications (ICAICTA)},
pages={1--5},
year={2019},
organization={IEEE}
}
``` |
RamanBola/TherapistConversation | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2831545
num_examples: 2129
download_size: 1453599
dataset_size: 2831545
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kaluniano12/LEANDRINHO_GAMEPLAYS1 | ---
license: openrail
---
|
LipengCS/Table-GPT | ---
pretty_name: Table-GPT (Table-tuned GPT for Diverse Table Tasks)
configs:
- config_name: All
data_files:
- split: train
path: "train/train_All.jsonl"
- split: test
path: "test/test_All.jsonl"
- config_name: ColumnAugmentation
data_files:
- split: train
path: "train/train_ColumnAugmentation.jsonl"
- config_name: ColumnFinding
data_files:
- split: test
path: "test/test_ColumnFinding.jsonl"
- config_name: ColumnTypeAnnotation
data_files:
- split: test
path: "test/test_ColumnTypeAnnotation.jsonl"
- config_name: DataImputation
data_files:
- split: train
path: "train/train_DataImputation.jsonl"
- split: test
path: "test/test_DataImputation.jsonl"
- config_name: EntityMatching
data_files:
- split: train
path: "train/train_EntityMatching.jsonl"
- split: test
path: "test/test_EntityMatching.jsonl"
- config_name: ErrorDetection
data_files:
- split: train
path: "train/train_ErrorDetection.jsonl"
- split: test
path: "test/test_ErrorDetection.jsonl"
- config_name: HeaderValueMatching
data_files:
- split: train
path: "train/train_HeaderValueMatching.jsonl"
- config_name: ListExtraction
data_files:
- split: train
path: "train/train_ListExtraction.jsonl"
- config_name: MissingValueIdentification
data_files:
- split: test
path: "test/test_MissingValueIdentification.jsonl"
- config_name: NL2SQL
data_files:
- split: train
path: "train/train_NL2SQL.jsonl"
- config_name: Row2RowTransform
data_files:
- split: train
path: "train/train_Row2RowTransform.jsonl"
- split: test
path: "test/test_Row2RowTransform.jsonl"
- config_name: RowAugmentation
data_files:
- split: train
path: "train/train_RowAugmentation.jsonl"
- config_name: RowColumnFiltering
data_files:
- split: train
path: "train/train_RowColumnFiltering.jsonl"
- config_name: RowColumnSorting
data_files:
- split: train
path: "train/train_RowColumnSorting.jsonl"
- config_name: RowColumnSwapping
data_files:
- split: train
path: "train/train_RowColumnSwapping.jsonl"
- config_name: SchemaMatching
data_files:
- split: train
path: "train/train_SchemaMatching.jsonl"
- split: test
path: "test/test_SchemaMatching.jsonl"
- config_name: TableQuestion
data_files:
- split: test
path: "test/test_TableQuestion.jsonl"
- config_name: TableSummarization
data_files:
- split: train
path: "train/train_TableSummarization.jsonl"
---
# Table-GPT: Table-tuned GPT for Diverse Table Tasks
This repository contains training and test datasets for the SIGMOD'24 paper [Table-GPT: Table-tuned GPT for Diverse Table Tasks](https://arxiv.org/abs/2310.09263). The source code for data generation and task evaluation is available here: https://github.com/LiPengCS/Table-GPT.
## Task Descriptions
We collect (or synthesize) 18 diverse table-related tasks, which are summarized in the table below. There are 14 training tasks (T-5 to T-18) and 9 test tasks (T-1 to T-9). Some of these tasks (T-1 to T-4) are used as unseen hold-out tasks to evaluate Table-GPT's ability to generalize to completely new and unseen tasks, while others (T-10 to T-18) are used for training only.
**Task Name** | **Task Description** | **Task Category** | **Train/Test**
----------------------------------------|--------------------------------------------------------------------------------------|---------------------|----------------
T-1: Missing-value identification (MV) | Identify the row and column position of the only missing cell in a given table | Table understanding | Test only
T-2: Column-finding (CF) | Identify the column-name of a specific value that appears only once in a given table | Table Understanding | Test only
T-3: Table-QA (TQA) | Answer a natural-language question based on the content of a table | Table QA | Test only
T-4: Column type annotation (CTA) | Find the semantic type of a column, from a given list of choices | Table understanding | Test only
T-5: Row-to-row transform (R2R) | Transform table data based on input/output examples | Data transformation | Train/Test
T-6: Entity matching (EM) | Match rows from two tables that refer to the same real-world entity | Table matching | Train/Test
T-7: Schema matching (SM) | Match columns from two tables that refer to the same meaning | Table matching | Train/Test
T-8: Data imputation (DI) | Predict the missing values in a cell based on the table context | Data cleaning | Train/Test
T-9: Error detection (ED) | Detect data values in a table that is a likely error from misspelling | Data cleaning | Train/Test
T-10: List extraction (LE) | Extract a structured table, from a list that lacks explicit column delimiters | Data transformation | Train only
T-11: Header value matching (HVM) | Match column-headers with its data values drawn from the same table | Table matching | Train only
T-12: Natural-language to SQL (NS) | Translate a natural-language question on a table into a SQL query | NL-to-SQL | Train only
T-13: Table summarization (TS) | Produce a natural-language summary for the content in a table | Data augmentation | Train only
T-14: Column augmentation (CA) | Augment a table with additional columns compatible with a given table | Data augmentation | Train only
T-15: Row augmentation (RA) | Augment a table with additional rows compatible with a given table | Data augmentation | Train only
T-16: Row/column swapping (RCSW) | Manipulate a given table, by swapping the position of two rows or columns | Table manipulation | Train only
T-17: Row/column filtering (RCF) | Manipulate a given table, by filtering on given rows or columns | Table manipulation | Train only
T-18: Row/column sorting (RCS) | Manipulate a given table, by performing sorting on given rows or columns | Table manipulation | Train only
## Structure
### Repository Structure
The structure of this repository is shown as follows.
```
Table-GPT
├── train
│ ├── train_All.jsonl # the merged training data of all training tasks
│ ├── train_{task_name}.csv # the training data for a specific training task
│ └── ...
│
└── test
├── test_All.jsonl # the merged test data of all test tasks
├── test_{task_name}.csv # the test data for a specific test task
└── ...
```
### Data Structure
Each line in the `.jsonl` file represents a single example, containing the following key items (a short loading sketch follows the list):
- **task**: The name of the task associated with the example.
- **dataset**: The name of the dataset from which the example originates.
- **prompt**: The input prompt provided to the model for generating a response.
- **completion**: The generated output response corresponding to the given prompt.
- **messages**: A list of messages that combine the prompt and completion, typically used in chat-oriented models.
- **metadata**: A dict for other information about the example.
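As a rough illustration (a sketch, assuming the configuration names defined above), a split can be loaded and one example inspected as follows:
```python
from datasets import load_dataset

# Load the entity-matching training split and inspect a single example
ds = load_dataset("LipengCS/Table-GPT", "EntityMatching", split="train")
example = ds[0]
print(example["task"], example["dataset"])
print(example["prompt"][:300])   # serialized table(s) plus the task instruction
print(example["completion"])     # the expected response
```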
## Dataset
### Training Sets
The datasets used for training tasks are summarized as follows
| **Task** | **Dataset** | **Size** |
|---|---|---|
| ColumnAugmentation | WebPBISynthetic_6 | 1008 |
| DataImputation | WebPBISynthetic_0 | 1414 |
| EntityMatching | 784datasets | 2010 |
| ErrorDetection | WebPBISynthetic_1 | 1494 |
| HeaderValueMatching | WebPBISynthetic_2 | 1954 |
| ListExtraction | WebPBISynthetic_3 | 985 |
| NL2SQL | WikiSQL | 994 |
| Row2RowTransform | Wiki | 951 |
| RowAugmentation | WebPBISynthetic_5 | 971 |
| RowColumnFiltering | WebPBISynthetic_8 | 1048 |
| RowColumnSorting | WebPBISynthetic_7 | 991 |
| RowColumnSwapping | WebPBISynthetic_9 | 1007 |
| SchemaMatching | Web | 2068 |
| TableSummarization | Web | 1014 |
### Test Sets
The datasets used for test tasks are summarized as follows
| **Task** | **Dataset** | **Size** |
|---|---|---|
| ColumnFinding | Spreadsheets-CF | 841 |
| ColumnTypeAnnotation | EfthymiouTest | 1188 |
| ColumnTypeAnnotation | LimayeTest | 348 |
| ColumnTypeAnnotation | SherlockTest | 1940 |
| ColumnTypeAnnotation | T2DTest | 734 |
| DataImputation | Spreadsheets-DI | 2000 |
| EntityMatching | Amazon-Google | 4586 |
| EntityMatching | Beer | 182 |
| EntityMatching | DBLP-ACM | 4946 |
| EntityMatching | DBLP-GoogleScholar | 11484 |
| EntityMatching | Fodors-Zagats | 378 |
| EntityMatching | Walmart-Amazon | 4098 |
| EntityMatching | iTunes-Amazon | 218 |
| ErrorDetection | Spreadsheets-ED-Real | 1740 |
| ErrorDetection | WebTables-ED-Real | 864 |
| MissingValueIdentification | Spreadsheets-MVI-ColumnNoSep | 2000 |
| MissingValueIdentification | Spreadsheets-MVI-ColumnSep | 2000 |
| MissingValueIdentification | Spreadsheets-MVI-RowNoSep | 2000 |
| MissingValueIdentification | Spreadsheets-MVI-RowSep | 2000 |
| Row2RowTransform | BingQL-Other | 102 |
| Row2RowTransform | BingQL-Unit | 99 |
| Row2RowTransform | FF-GR-Trifacta | 134 |
| Row2RowTransform | Headcase | 90 |
| Row2RowTransform | Stackoverflow | 145 |
| SchemaMatching | DeepM | 14 |
| TableQuestion | SQATest | 360 |
| TableQuestion | WikiTest | 4344 |
|
joey234/mmlu-college_biology-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 100543
num_examples: 144
download_size: 60900
dataset_size: 100543
---
# Dataset Card for "mmlu-college_biology-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
floleuerer/OpenSchnabeltier_alpaca | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 22593409
num_examples: 21749
download_size: 11357017
dataset_size: 22593409
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
clyu/sg_55k_cleaned_en | ---
dataset_info:
features:
- name: prompt_id
dtype: string
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train_sft
num_bytes: 693725733.3
num_examples: 75278
- name: test_sft
num_bytes: 36511880.7
num_examples: 3962
download_size: 326870876
dataset_size: 730237614.0
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
---
|
muhammadhasnain100/dict_3D_house | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 11455332.0
num_examples: 2000
download_size: 3984049
dataset_size: 11455332.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_cola_your_yalls | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 573
num_examples: 5
- name: test
num_bytes: 924
num_examples: 10
- name: train
num_bytes: 8394
num_examples: 82
download_size: 9552
dataset_size: 9891
---
# Dataset Card for "MULTI_VALUE_cola_your_yalls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joseluhf11/oct-object-detection-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
struct:
- name: bbox
sequence:
sequence: int64
- name: categories
sequence: string
splits:
- name: train
num_bytes: 151816462.898
num_examples: 1246
download_size: 71645254
dataset_size: 151816462.898
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oct-object-detection-v2"
The dataset is composed of images with multiple object-detection boxes in COCO format (x, y, w, h). The images are OCTs (a type of eye scan) with boxes indicating features associated with AMD disease.
The only difference from v1 is that images are grouped into a single row per detection-object class.
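A small usage sketch (assuming the feature schema above, with one category per box):
```python
from datasets import load_dataset

ds = load_dataset("joseluhf11/oct-object-detection-v2", split="train")
example = ds[0]

# Boxes are stored COCO-style as (x, y, w, h); convert to corner coordinates
for (x, y, w, h), category in zip(example["objects"]["bbox"], example["objects"]["categories"]):
    print(category, (x, y, x + w, y + h))
```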
[Source dataset](https://doi.org/10.1101/2023.03.29.534704) |
ShrinivasSK/en_kn_1 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 4017735.9
num_examples: 18000
- name: test
num_bytes: 446415.1
num_examples: 2000
download_size: 2392888
dataset_size: 4464151.0
---
# Dataset Card for "data_kn_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jarod0411/3CLPro_5120_6_14 | ---
dataset_info:
features:
- name: DockingScore
dtype: float64
- name: smiles
dtype: string
- name: scaffold_smiles
dtype: string
- name: selfies
dtype: string
- name: scaffold_selfies
dtype: string
- name: sa
dtype: float64
- name: norm_sa
dtype: float64
- name: sol
dtype: float64
- name: norm_sol
dtype: float64
- name: qed
dtype: float64
- name: dock
dtype: float64
- name: norm_dock
dtype: float64
splits:
- name: train
num_bytes: 2669042
num_examples: 5120
download_size: 1158982
dataset_size: 2669042
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-21000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1021278
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
VivendoDigital/belebele-chat-ita-sft | ---
license: apache-2.0
---
|
rxsmzfg/1 | ---
license: openrail
---
|
vidhikatkoria/SGD_Movies | ---
dataset_info:
features:
- name: domain
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: act
dtype: int64
- name: speaker
dtype: int64
splits:
- name: train
num_bytes: 1808099.5360110803
num_examples: 7219
- name: test
num_bytes: 297
num_examples: 1
download_size: 729887
dataset_size: 1808396.5360110803
---
# Dataset Card for "SGD_Movies"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mekaneeky/Synthetic_English_MMS | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: eng
dtype: string
- name: lug
dtype: string
- name: ach
dtype: string
- name: teo
dtype: string
- name: lgg
dtype: string
- name: nyn
dtype: string
- name: ID
dtype: string
- name: eng_tts
sequence:
sequence: float32
splits:
- name: train
num_bytes: 12857414976
num_examples: 23947
- name: dev
num_bytes: 267728460
num_examples: 500
- name: test
num_bytes: 266636552
num_examples: 500
download_size: 13400072749
dataset_size: 13391779988
---
# Dataset Card for "Synthetic_English_MMS_EL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indiehackers/winogrande_debiased-telugu-romanized-nodict | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
- name: qas_id
dtype: int64
splits:
- name: train
num_bytes: 1463187
num_examples: 9248
- name: test
num_bytes: 276255
num_examples: 1767
- name: valid
num_bytes: 199703
num_examples: 1267
download_size: 991176
dataset_size: 1939145
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
katie312/fundus | ---
license: mit
---
|
ekolasky/ResultsIdSet | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 288424
num_examples: 20
download_size: 42786
dataset_size: 288424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Sumanth7502/lakme | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 78845.0
num_examples: 9
download_size: 76086
dataset_size: 78845.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
umarigan/turkish_corpus_small | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 6544684803
num_examples: 1500000
download_size: 3575292940
dataset_size: 6544684803
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LuisLenin/Datasetclinicalv2 | ---
license: openrail
task_categories:
- token-classification
pretty_name: Datasetclinicalv2
size_categories:
- n<1K
--- |
FaalSa/dataO | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57629
num_examples: 1
- name: validation
num_bytes: 58109
num_examples: 1
- name: test
num_bytes: 58589
num_examples: 1
download_size: 9751
dataset_size: 174327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Tanvir1337/quotes | ---
license: cdla-sharing-1.0
pretty_name: Quotes
tags:
- GPT-3.5
- GPT-4
- Claude
- Bard
- Alpaca
- LLaMA
- LLaMA-2
- Vicuna
- PaLM-2
- Mistral-7B
language:
- en
size_categories:
- 1K<n<10K
---
# Quotes [JSON dataset]
A dataset comprising artificially generated **quotes** derived from a diverse array of Large Language Models (LLMs) such as GPT-3.5, GPT-4, Claude, Bard, Alpaca, LLaMA, LLaMA-2, Vicuna, PaLM-2 and Mistral-7B.
## Dataset Contents
The dataset comprises artificially generated quotes, with each quote offering a unique perspective on various topics, accompanied by a title, description, and a designated topic. These quotes are entirely generated by AI and are not to be considered as statements of real-world wisdom or knowledge.
## Prompt
The prompt used:
```json
Generate a JSON-formatted list of synthetically generated quotes on various topics, ensuring that each entry follows the specified structure:
'''json
[
{
"title": "...",
"description": "...",
"topic": "..."
},
]
'''
```
## Disclaimer
Please note that while I strive to maintain data quality, I cannot guarantee the accuracy or quality of all entries in this dataset. Use it responsibly and exercise caution when relying on the data for any critical applications. Your feedback and contributions are greatly appreciated for improving the dataset's overall quality.
|
nks9/NKS_EYE_DISEASE_CLASSIFICATION | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': cataract
'1': diabetic_retinopathy
'2': glaucoma
'3': normal
splits:
- name: train
num_bytes: 711601945.5026784
num_examples: 3795
- name: test
num_bytes: 66862663.13232156
num_examples: 422
download_size: 772276145
dataset_size: 778464608.635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
renascerstudio/lisa | ---
license: openrail
---
|
AlekseyKorshuk/DotCHA-100k | ---
dataset_info:
features:
- name: letter
dtype: string
- name: buckets
sequence:
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1681365292
num_examples: 100000
download_size: 1002860686
dataset_size: 1681365292
---
# Dataset Card for "DotCHA-100k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |