| datasetId | card |
|---|---|
Karthik1080/medquad_llama | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1345005
num_examples: 1000
download_size: 616713
dataset_size: 1345005
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M | ---
pretty_name: Evaluation run of nicholasKluge/Aira-Instruct-124M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-Instruct-124M](https://huggingface.co/nicholasKluge/Aira-Instruct-124M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-10T09:14:16.516035](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M/blob/main/results_2023-08-10T09%3A14%3A16.516035.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25097821278031224,\n\
\ \"acc_stderr\": 0.03126312568682377,\n \"acc_norm\": 0.25197883172295243,\n\
\ \"acc_norm_stderr\": 0.03127882498671644,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111075,\n \"mc2\": 0.3793773096260545,\n\
\ \"mc2_stderr\": 0.01493606177741941\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19368600682593856,\n \"acc_stderr\": 0.01154842540997854,\n\
\ \"acc_norm\": 0.2354948805460751,\n \"acc_norm_stderr\": 0.012399451855004753\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2909778928500299,\n\
\ \"acc_stderr\": 0.004532850566893526,\n \"acc_norm\": 0.3082055367456682,\n\
\ \"acc_norm_stderr\": 0.004608082815535503\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.0285048564705142,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.0285048564705142\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.022755204959542936,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.022755204959542936\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.32323232323232326,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24102564102564103,\n \"acc_stderr\": 0.021685546665333195,\n\
\ \"acc_norm\": 0.24102564102564103,\n \"acc_norm_stderr\": 0.021685546665333195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.1722689075630252,\n \"acc_stderr\": 0.02452866497130541,\n \
\ \"acc_norm\": 0.1722689075630252,\n \"acc_norm_stderr\": 0.02452866497130541\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.17218543046357615,\n \"acc_stderr\": 0.030826136961962385,\n \"\
acc_norm\": 0.17218543046357615,\n \"acc_norm_stderr\": 0.030826136961962385\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28440366972477066,\n \"acc_stderr\": 0.019342036587702605,\n \"\
acc_norm\": 0.28440366972477066,\n \"acc_norm_stderr\": 0.019342036587702605\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.31645569620253167,\n \"acc_stderr\": 0.03027497488021898,\n \
\ \"acc_norm\": 0.31645569620253167,\n \"acc_norm_stderr\": 0.03027497488021898\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\
\ \"acc_stderr\": 0.02079940008287998,\n \"acc_norm\": 0.10762331838565023,\n\
\ \"acc_norm_stderr\": 0.02079940008287998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.32231404958677684,\n \"acc_stderr\": 0.04266416363352167,\n \"\
acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.04266416363352167\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\
\ \"acc_stderr\": 0.015594955384455765,\n \"acc_norm\": 0.2554278416347382,\n\
\ \"acc_norm_stderr\": 0.015594955384455765\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.01437816988409841,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.01437816988409841\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18971061093247588,\n\
\ \"acc_stderr\": 0.022268196258783218,\n \"acc_norm\": 0.18971061093247588,\n\
\ \"acc_norm_stderr\": 0.022268196258783218\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.0230167056402622,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.0230167056402622\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.028245687391462916,\n\
\ \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.028245687391462916\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.30612244897959184,\n\
\ \"acc_stderr\": 0.029504896454595957,\n \"acc_norm\": 0.30612244897959184,\n\
\ \"acc_norm_stderr\": 0.029504896454595957\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348377,\n\
\ \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348377\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.23493975903614459,\n \"acc_stderr\": 0.03300533186128922,\n\
\ \"acc_norm\": 0.23493975903614459,\n \"acc_norm_stderr\": 0.03300533186128922\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.036155076303109344,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.036155076303109344\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.014679255032111075,\n\
\ \"mc2\": 0.3793773096260545,\n \"mc2_stderr\": 0.01493606177741941\n\
\ }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-Instruct-124M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|arc:challenge|25_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hellaswag|10_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T09:14:16.516035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:14:16.516035.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T09:14:16.516035.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T09:14:16.516035.parquet'
- config_name: results
data_files:
- split: 2023_08_10T09_14_16.516035
path:
- results_2023-08-10T09:14:16.516035.parquet
- split: latest
path:
- results_2023-08-10T09:14:16.516035.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-124M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-Instruct-124M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-124M](https://huggingface.co/nicholasKluge/Aira-Instruct-124M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M",
"harness_truthfulqa_mc_0",
split="train")
```
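Each run's split name is derived from its timestamp. A minimal sketch of the mapping, inferred from the split names listed in this card (an assumption, not an official API):

```python
# Convert a run timestamp into the split-name form used in this card's configs
timestamp = "2023-08-10T09:14:16.516035"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_08_10T09_14_16.516035
```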
## Latest results
These are the [latest results from run 2023-08-10T09:14:16.516035](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M/blob/main/results_2023-08-10T09%3A14%3A16.516035.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25097821278031224,
"acc_stderr": 0.03126312568682377,
"acc_norm": 0.25197883172295243,
"acc_norm_stderr": 0.03127882498671644,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111075,
"mc2": 0.3793773096260545,
"mc2_stderr": 0.01493606177741941
},
"harness|arc:challenge|25": {
"acc": 0.19368600682593856,
"acc_stderr": 0.01154842540997854,
"acc_norm": 0.2354948805460751,
"acc_norm_stderr": 0.012399451855004753
},
"harness|hellaswag|10": {
"acc": 0.2909778928500299,
"acc_stderr": 0.004532850566893526,
"acc_norm": 0.3082055367456682,
"acc_norm_stderr": 0.004608082815535503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.0285048564705142,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.0285048564705142
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2,
"acc_stderr": 0.022755204959542936,
"acc_norm": 0.2,
"acc_norm_stderr": 0.022755204959542936
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24102564102564103,
"acc_stderr": 0.021685546665333195,
"acc_norm": 0.24102564102564103,
"acc_norm_stderr": 0.021685546665333195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.1722689075630252,
"acc_stderr": 0.02452866497130541,
"acc_norm": 0.1722689075630252,
"acc_norm_stderr": 0.02452866497130541
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17218543046357615,
"acc_stderr": 0.030826136961962385,
"acc_norm": 0.17218543046357615,
"acc_norm_stderr": 0.030826136961962385
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28440366972477066,
"acc_stderr": 0.019342036587702605,
"acc_norm": 0.28440366972477066,
"acc_norm_stderr": 0.019342036587702605
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31645569620253167,
"acc_stderr": 0.03027497488021898,
"acc_norm": 0.31645569620253167,
"acc_norm_stderr": 0.03027497488021898
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.02079940008287998,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.02079940008287998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.04266416363352167,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.04266416363352167
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455765,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455765
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.01437816988409841,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.01437816988409841
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18971061093247588,
"acc_stderr": 0.022268196258783218,
"acc_norm": 0.18971061093247588,
"acc_norm_stderr": 0.022268196258783218
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.0230167056402622,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.0230167056402622
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3161764705882353,
"acc_stderr": 0.028245687391462916,
"acc_norm": 0.3161764705882353,
"acc_norm_stderr": 0.028245687391462916
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.30612244897959184,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.30612244897959184,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348377,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348377
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.23493975903614459,
"acc_stderr": 0.03300533186128922,
"acc_norm": 0.23493975903614459,
"acc_norm_stderr": 0.03300533186128922
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.036155076303109344,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.036155076303109344
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111075,
"mc2": 0.3793773096260545,
"mc2_stderr": 0.01493606177741941
}
}
```
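The `"all"` entry summarizes the per-task scores. As a minimal sketch of averaging per-task accuracies from such a dictionary — using a hypothetical two-task subset of the results above, not the full set, and noting that the leaderboard's actual aggregation may weight tasks differently:

```python
# Hypothetical subset of the results dictionary above, for illustration only
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22, "acc_norm": 0.22},
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.23, "acc_norm": 0.23},
}

# Mean accuracy across the selected tasks
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 3))  # 0.225
```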
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
theblackcat102/audio-alpaca | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
- automatic-speech-recognition
language:
- en
size_categories:
- 10K<n<100K
--- |
yuyijiong/FoodSafe-Doc-QA-Chinese | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
---
## Instruction fine-tuning data for the food safety domain
* Contains two tasks: multi-document QA and paper QA
* The documents come from Chinese national food safety standards, textbooks, and review papers
Matthijs/cmu-arctic-xvectors | ---
pretty_name: CMU ARCTIC X-Vectors
task_categories:
- text-to-speech
- audio-to-audio
license: mit
---
# Speaker embeddings extracted from CMU ARCTIC
There is one `.npy` file for each utterance in the dataset, 7931 files in total. The speaker embeddings are 512-element X-vectors.
The [CMU ARCTIC](http://www.festvox.org/cmu_arctic/) dataset divides the utterances among the following speakers:
- bdl (US male)
- slt (US female)
- jmk (Canadian male)
- awb (Scottish male)
- rms (US male)
- clb (US female)
- ksp (Indian male)
The X-vectors were extracted using [this script](https://huggingface.co/mechanicalsea/speecht5-vc/blob/main/manifest/utils/prep_cmu_arctic_spkemb.py), which uses the `speechbrain/spkrec-xvect-voxceleb` model.
Usage:
```python
import torch
from datasets import load_dataset

embeddings_dataset = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embeddings = embeddings_dataset[7306]["xvector"]
speaker_embeddings = torch.tensor(speaker_embeddings).unsqueeze(0)
```
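Since each x-vector has 512 elements, the tensor produced above has shape `(1, 512)` after `unsqueeze(0)`. A minimal sketch with a dummy vector, so no dataset download is needed:

```python
import torch

# Stand-in for one 512-element x-vector from the dataset
dummy_xvector = [0.0] * 512
speaker_embeddings = torch.tensor(dummy_xvector).unsqueeze(0)
print(speaker_embeddings.shape)  # torch.Size([1, 512])
```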
|
CyberHarem/sonoda_umi_lovelive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sonoda_umi/園田海未/소노다우미 (Love Live!)
This is the dataset of sonoda_umi/園田海未/소노다우미 (Love Live!), containing 500 images and their tags.
The core tags of this character are `long_hair, blue_hair, yellow_eyes, bangs, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 716.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sonoda_umi_lovelive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 393.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sonoda_umi_lovelive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1212 | 832.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sonoda_umi_lovelive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 621.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sonoda_umi_lovelive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1212 | 1.17 GiB | [Download](https://huggingface.co/datasets/CyberHarem/sonoda_umi_lovelive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sonoda_umi_lovelive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, solo, looking_at_viewer, fur_trim, blush, capelet, smile, open_mouth, beret, detached_sleeves, shorts, ribbon |
| 1 | 5 |  |  |  |  |  | 1girl, earrings, looking_at_viewer, smile, solo, white_gloves, blue_necktie, fingerless_gloves, open_mouth, short_sleeves, beret, blush, simple_background, skirt, swept_bangs |
| 2 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, blush, flower, blue_dress, choker, earrings, open_mouth, white_gloves, bare_shoulders, hair_ornament, cowboy_shot, head_wreath, simple_background |
| 3 | 27 |  |  |  |  |  | 1girl, solo, hair_flower, looking_at_viewer, blush, bracelet, navel, bikini_skirt, hairband, smile, earrings, necklace, breasts, brown_eyes, frilled_bikini |
| 4 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, bare_shoulders, detached_sleeves, holding_bouquet, simple_background, bridal_veil, bride, hair_flower, petals, strapless_dress, wedding_dress, white_background, blush, choker, garter_straps, jewelry, open_mouth, rose, thighhighs, tiara, white_dress |
| 5 | 5 |  |  |  |  |  | 1girl, blush, hair_flower, short_sleeves, smile, solo, looking_at_viewer, swept_bangs, open_mouth, ribbon, sailor_collar, school_uniform, upper_body, dress, simple_background |
| 6 | 5 |  |  |  |  |  | 1girl, air_bubble, jewelry, looking_at_viewer, solo, underwater, smile, circlet, coral, dress, high_heels, midriff, navel, white_gloves |
| 7 | 6 |  |  |  |  |  | 1girl, angel_wings, blush, feathered_wings, hair_flower, headset, looking_at_viewer, solo, dress, x_hair_ornament, bare_shoulders, butterfly_hair_ornament, microphone, white_wings, barefoot, open_mouth, ribbon, smile |
| 8 | 14 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, maid_headdress, simple_background, smile, double_bun, maid_apron, white_background, cowboy_shot, detached_sleeves, earrings, ribbon |
| 9 | 5 |  |  |  |  |  | 1girl, blush, brown_eyes, looking_at_viewer, otonokizaka_school_uniform, solo, bow, simple_background, summer_uniform, sweater_vest, white_background, shirt, skirt, aqua_panties, embarrassed, no_pants, short_sleeves, sitting, smile, socks, striped |
| 10 | 14 |  |  |  |  |  | long_sleeves, otonokizaka_school_uniform, red_bowtie, striped_bowtie, winter_uniform, 1girl, blazer, pleated_skirt, smile, solo, blush, looking_at_viewer, plaid_skirt, cowboy_shot, open_mouth, simple_background |
| 11 | 7 |  |  |  |  |  | 1girl, closed_mouth, otonokizaka_school_uniform, red_bowtie, solo, upper_body, blazer, blush, looking_at_viewer, smile, striped_bowtie, simple_background, white_shirt, winter_uniform, long_sleeves, white_background, blue_jacket, floating_hair |
| 12 | 5 |  |  |  |  |  | 1girl, blush, otonokizaka_school_uniform, pleated_skirt, red_bowtie, short_sleeves, solo, striped_bowtie, summer_uniform, white_shirt, looking_at_viewer, sitting, sweater_vest, blue_skirt, plaid_skirt, indoors, kneehighs, open_mouth |
| 13 | 7 |  |  |  |  |  | 1girl, blush, day, sky, solo, looking_at_viewer, smile, sun_hat, cloud, outdoors, sundress, white_dress, sleeveless, bare_shoulders, ocean, ribbon |
| 14 | 7 |  |  |  |  |  | 1girl, blush, floral_print, kimono, looking_at_viewer, smile, solo, hair_flower, wide_sleeves, obi, detached_sleeves, holding |
| 15 | 14 |  |  |  |  |  | 1girl, muneate, kyuudou, solo, yugake, holding_bow_(weapon), single_glove, hakama_skirt, blush, holding_arrow, looking_at_viewer |
| 16 | 6 |  |  |  |  |  | 1girl, solo, black_gloves, blush, checkered_skirt, looking_at_viewer, striped_thighhighs, treble_clef, flower, microphone_stand, mini_top_hat, sleeveless, earrings, necktie, open_mouth, sitting, smile |
| 17 | 13 |  |  |  |  |  | 1girl, bun_cover, double_bun, solo, looking_at_viewer, china_dress, blush, smile, fingerless_gloves, side_slit, open_mouth, boots, floral_print, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | fur_trim | blush | capelet | smile | open_mouth | beret | detached_sleeves | shorts | ribbon | earrings | white_gloves | blue_necktie | fingerless_gloves | short_sleeves | simple_background | skirt | swept_bangs | flower | blue_dress | choker | bare_shoulders | hair_ornament | cowboy_shot | head_wreath | hair_flower | bracelet | navel | bikini_skirt | hairband | necklace | breasts | brown_eyes | frilled_bikini | holding_bouquet | bridal_veil | bride | petals | strapless_dress | wedding_dress | white_background | garter_straps | jewelry | rose | thighhighs | tiara | white_dress | sailor_collar | school_uniform | upper_body | dress | air_bubble | underwater | circlet | coral | high_heels | midriff | angel_wings | feathered_wings | headset | x_hair_ornament | butterfly_hair_ornament | microphone | white_wings | barefoot | maid_headdress | double_bun | maid_apron | otonokizaka_school_uniform | bow | summer_uniform | sweater_vest | shirt | aqua_panties | embarrassed | no_pants | sitting | socks | striped | long_sleeves | red_bowtie | striped_bowtie | winter_uniform | blazer | pleated_skirt | plaid_skirt | closed_mouth | white_shirt | blue_jacket | floating_hair | blue_skirt | indoors | kneehighs | day | sky | sun_hat | cloud | outdoors | sundress | sleeveless | ocean | floral_print | kimono | wide_sleeves | obi | holding | muneate | kyuudou | yugake | holding_bow_(weapon) | single_glove | hakama_skirt | holding_arrow | black_gloves | checkered_skirt | striped_thighhighs | treble_clef | microphone_stand | mini_top_hat | necktie | bun_cover | china_dress | side_slit | boots |
|----:|----------:|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|:----|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | X | X | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 27 |  |  |  |  |  | X | X | X | | X | | X | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | | X | | | | | | | | X | | | | | X | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | | | | X | | | | | X | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | | X | | X | X | | | | X | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 14 |  |  |  |  |  | X | X | X | | X | | X | | | X | | X | X | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | X | | X | | X | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 14 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 7 |  |  |  |  |  | X | X | X | | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | | | | | X | | | | X | X | | | X | X | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 7 |  |  |  |  |  | X | X | X | | X | | X | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 14 | 7 |  |  |  |  |  | X | X | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 15 | 14 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | |
| 16 | 6 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | |
| 17 | 13 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X |
|
jlbaker361/prior-reward | ---
dataset_info:
features:
- name: man
dtype: image
- name: woman
dtype: image
- name: boy
dtype: image
- name: girl
dtype: image
- name: character
dtype: image
- name: person
dtype: image
splits:
- name: train
num_bytes: 75309398.0
num_examples: 28
download_size: 75317328
dataset_size: 75309398.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_2_t_0.9 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43729495
num_examples: 18928
- name: epoch_1
num_bytes: 44301102
num_examples: 18928
- name: epoch_2
num_bytes: 44373560
num_examples: 18928
- name: epoch_3
num_bytes: 44407111
num_examples: 18928
- name: epoch_4
num_bytes: 44414707
num_examples: 18928
- name: epoch_5
num_bytes: 44399743
num_examples: 18928
- name: epoch_6
num_bytes: 44388093
num_examples: 18928
- name: epoch_7
num_bytes: 44380265
num_examples: 18928
- name: epoch_8
num_bytes: 44375264
num_examples: 18928
- name: epoch_9
num_bytes: 44372364
num_examples: 18928
- name: epoch_10
num_bytes: 44371662
num_examples: 18928
- name: epoch_11
num_bytes: 44370199
num_examples: 18928
- name: epoch_12
num_bytes: 44372011
num_examples: 18928
- name: epoch_13
num_bytes: 44371609
num_examples: 18928
- name: epoch_14
num_bytes: 44369899
num_examples: 18928
- name: epoch_15
num_bytes: 44370705
num_examples: 18928
- name: epoch_16
num_bytes: 44371750
num_examples: 18928
- name: epoch_17
num_bytes: 44370318
num_examples: 18928
- name: epoch_18
num_bytes: 44371110
num_examples: 18928
- name: epoch_19
num_bytes: 44372001
num_examples: 18928
- name: epoch_20
num_bytes: 44370853
num_examples: 18928
- name: epoch_21
num_bytes: 44369629
num_examples: 18928
- name: epoch_22
num_bytes: 44371126
num_examples: 18928
- name: epoch_23
num_bytes: 44371020
num_examples: 18928
- name: epoch_24
num_bytes: 44369679
num_examples: 18928
- name: epoch_25
num_bytes: 44371722
num_examples: 18928
- name: epoch_26
num_bytes: 44370162
num_examples: 18928
- name: epoch_27
num_bytes: 44371241
num_examples: 18928
- name: epoch_28
num_bytes: 44370149
num_examples: 18928
- name: epoch_29
num_bytes: 44369233
num_examples: 18928
download_size: 814280532
dataset_size: 1330557782
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
KK1mo/tedigan_gen_All | ---
dataset_info:
features:
- name: id
dtype: string
- name: caption
dtype: string
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 354831267.0
num_examples: 2998
download_size: 354734037
dataset_size: 354831267.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gbharti/wealth-alpaca_lora | ---
language:
- en
---
This dataset is a combination of Stanford's Alpaca (https://github.com/tatsu-lab/stanford_alpaca) and FiQA (https://sites.google.com/view/fiqa/), with another 1.3k pairs custom-generated using GPT-3.5.
Script for tuning through Kaggle's (https://www.kaggle.com) free resources using PEFT/LoRA: https://www.kaggle.com/code/gbhacker23/wealth-alpaca-lora |
MottsCoding/meltpools1k | ---
dataset_info:
features:
- name: images
dtype: image
- name: labels
sequence:
sequence: int32
splits:
- name: train
num_bytes: 50697182.0
num_examples: 9
download_size: 0
dataset_size: 50697182.0
---
# Dataset Card for "meltpools1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/vr_val_free_4 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 7119718719
num_examples: 10000
download_size: 1206748228
dataset_size: 7119718719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_MysteriousAI__Mia-001 | ---
pretty_name: Evaluation run of MysteriousAI/Mia-001
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MysteriousAI/Mia-001](https://huggingface.co/MysteriousAI/Mia-001) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MysteriousAI__Mia-001\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T16:52:06.968259](https://huggingface.co/datasets/open-llm-leaderboard/details_MysteriousAI__Mia-001/blob/main/results_2024-04-02T16-52-06.968259.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23731668171739753,\n\
\ \"acc_stderr\": 0.03006408174110251,\n \"acc_norm\": 0.23714256037066517,\n\
\ \"acc_norm_stderr\": 0.03085442849124215,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359654,\n \"mc2\": 0.4825112031965409,\n\
\ \"mc2_stderr\": 0.01620968936835715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2022184300341297,\n \"acc_stderr\": 0.011737454431872105,\n\
\ \"acc_norm\": 0.22781569965870307,\n \"acc_norm_stderr\": 0.012256708602326919\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27454690300736906,\n\
\ \"acc_stderr\": 0.004453735900947837,\n \"acc_norm\": 0.2802230631348337,\n\
\ \"acc_norm_stderr\": 0.0044819026375056675\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n\
\ \"acc_stderr\": 0.032477811859955935,\n \"acc_norm\": 0.17037037037037037,\n\
\ \"acc_norm_stderr\": 0.032477811859955935\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.19622641509433963,\n \"acc_stderr\": 0.024442388131100844,\n\
\ \"acc_norm\": 0.19622641509433963,\n \"acc_norm_stderr\": 0.024442388131100844\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165085,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165085\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149353,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149353\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.03861229196653697,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.03861229196653697\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.02895734278834235,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.02895734278834235\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003337,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003337\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.0292255758924896,\n\
\ \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.0292255758924896\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586825,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586825\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860674,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128002,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128002\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20168067226890757,\n \"acc_stderr\": 0.026064313406304523,\n\
\ \"acc_norm\": 0.20168067226890757,\n \"acc_norm_stderr\": 0.026064313406304523\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1908256880733945,\n \"acc_stderr\": 0.016847676400091112,\n \"\
acc_norm\": 0.1908256880733945,\n \"acc_norm_stderr\": 0.016847676400091112\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n\
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.19230769230769232,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n\
\ \"acc_stderr\": 0.015246803197398687,\n \"acc_norm\": 0.2388250319284802,\n\
\ \"acc_norm_stderr\": 0.015246803197398687\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.20915032679738563,\n \"acc_stderr\": 0.023287685312334806,\n\
\ \"acc_norm\": 0.20915032679738563,\n \"acc_norm_stderr\": 0.023287685312334806\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1832797427652733,\n\
\ \"acc_stderr\": 0.02197419884826581,\n \"acc_norm\": 0.1832797427652733,\n\
\ \"acc_norm_stderr\": 0.02197419884826581\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22340425531914893,\n \"acc_stderr\": 0.02484792135806396,\n \
\ \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.02484792135806396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n\
\ \"acc_stderr\": 0.010946570966348787,\n \"acc_norm\": 0.242503259452412,\n\
\ \"acc_norm_stderr\": 0.010946570966348787\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.02456220431414232,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.02456220431414232\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.04122066502878285,\n\
\ \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.04122066502878285\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19183673469387755,\n\
\ \"acc_stderr\": 0.025206963154225395,\n \"acc_norm\": 0.19183673469387755,\n\
\ \"acc_norm_stderr\": 0.025206963154225395\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772436,\n\
\ \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772436\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.03508771929824563,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.03508771929824563\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.015102404797359654,\n\
\ \"mc2\": 0.4825112031965409,\n \"mc2_stderr\": 0.01620968936835715\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.516179952644041,\n\
\ \"acc_stderr\": 0.014045126130978601\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/MysteriousAI/Mia-001
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|arc:challenge|25_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|gsm8k|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hellaswag|10_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T16-52-06.968259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T16-52-06.968259.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- '**/details_harness|winogrande|5_2024-04-02T16-52-06.968259.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T16-52-06.968259.parquet'
- config_name: results
data_files:
- split: 2024_04_02T16_52_06.968259
path:
- results_2024-04-02T16-52-06.968259.parquet
- split: latest
path:
- results_2024-04-02T16-52-06.968259.parquet
---
# Dataset Card for Evaluation run of MysteriousAI/Mia-001
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MysteriousAI/Mia-001](https://huggingface.co/MysteriousAI/Mia-001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MysteriousAI__Mia-001",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-02T16:52:06.968259](https://huggingface.co/datasets/open-llm-leaderboard/details_MysteriousAI__Mia-001/blob/main/results_2024-04-02T16-52-06.968259.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23731668171739753,
"acc_stderr": 0.03006408174110251,
"acc_norm": 0.23714256037066517,
"acc_norm_stderr": 0.03085442849124215,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359654,
"mc2": 0.4825112031965409,
"mc2_stderr": 0.01620968936835715
},
"harness|arc:challenge|25": {
"acc": 0.2022184300341297,
"acc_stderr": 0.011737454431872105,
"acc_norm": 0.22781569965870307,
"acc_norm_stderr": 0.012256708602326919
},
"harness|hellaswag|10": {
"acc": 0.27454690300736906,
"acc_stderr": 0.004453735900947837,
"acc_norm": 0.2802230631348337,
"acc_norm_stderr": 0.0044819026375056675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.032477811859955935,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.032477811859955935
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.19622641509433963,
"acc_stderr": 0.024442388131100844,
"acc_norm": 0.19622641509433963,
"acc_norm_stderr": 0.024442388131100844
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165085,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165085
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149353,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149353
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.02895734278834235,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.02895734278834235
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003337,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003337
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.3,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.0292255758924896,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.0292255758924896
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586825,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860674,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128002,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128002
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20168067226890757,
"acc_stderr": 0.026064313406304523,
"acc_norm": 0.20168067226890757,
"acc_norm_stderr": 0.026064313406304523
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1908256880733945,
"acc_stderr": 0.016847676400091112,
"acc_norm": 0.1908256880733945,
"acc_norm_stderr": 0.016847676400091112
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19230769230769232,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.19230769230769232,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398687,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20915032679738563,
"acc_stderr": 0.023287685312334806,
"acc_norm": 0.20915032679738563,
"acc_norm_stderr": 0.023287685312334806
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1832797427652733,
"acc_stderr": 0.02197419884826581,
"acc_norm": 0.1832797427652733,
"acc_norm_stderr": 0.02197419884826581
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.02484792135806396,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.02484792135806396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.010946570966348787,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.010946570966348787
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.02456220431414232,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.02456220431414232
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359654,
"mc2": 0.4825112031965409,
"mc2_stderr": 0.01620968936835715
},
"harness|winogrande|5": {
"acc": 0.516179952644041,
"acc_stderr": 0.014045126130978601
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
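The leaderboard's aggregated metrics are essentially macro-averages of the per-task scores in the JSON above. A minimal sketch of that aggregation (the dict below is truncated to two tasks copied from the JSON; the full `results` configuration holds all of them):

```python
# Sketch: macro-average the MMLU (hendrycksTest) accuracies from the
# results dict above. Truncated to two tasks for brevity; the real
# "results" configuration contains every harness task shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.26},
}

# Collect the per-subject accuracies and average them, mirroring how
# the leaderboard aggregates per-task scores into a single number.
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 2))  # -> 0.24
```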
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
KaiLv/UDR_CNNDailyMail | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: article
dtype: string
- name: highlights
dtype: string
- name: len_article
dtype: int64
- name: len_highlights
dtype: int64
splits:
- name: train
num_bytes: 453635426
num_examples: 155098
- name: validation
num_bytes: 21468466
num_examples: 7512
- name: test
num_bytes: 18215547
num_examples: 6379
- name: debug
num_bytes: 292572035
num_examples: 100000
download_size: 484340245
dataset_size: 785891474
---
# Dataset Card for "UDR_CNNDailyMail"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ruanchaves/lynx | ---
annotations_creators:
- expert-generated
language_creators:
- machine-generated
language:
- code
license:
- unknown
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- original
task_categories:
- structure-prediction
- code-generation
- conditional-text-generation
task_ids: []
pretty_name: Lynx
tags:
- word-segmentation
---
# Dataset Card for Lynx
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Paper:** [Helpful or Not? An investigation on the feasibility of identifier splitting via CNN-BiLSTM-CRF](https://ksiresearch.org/seke/seke18paper/seke18paper_167.pdf)
### Dataset Summary
In programming languages, identifiers are tokens (also called symbols) which name language entities.
Some of the kinds of entities an identifier might denote include variables, types, labels, subroutines, and packages.
Lynx is a dataset for identifier segmentation, i.e., the task of adding spaces between the words in an identifier.
Besides identifier segmentation, the gold labels for this dataset also include abbreviation expansion.
### Languages
- C
## Dataset Structure
### Data Instances
```
{
"index": 3,
"identifier": "abspath",
"segmentation": "abs path",
"expansion": "absolute path",
"spans": {
"text": [
"abs"
],
"expansion": [
"absolute"
],
"start": [
0
],
"end": [
4
]
}
}
```
### Data Fields
- `index`: a numerical index.
- `identifier`: the original identifier.
- `segmentation`: the gold segmentation for the identifier, without abbreviation expansion.
- `expansion`: the gold segmentation for the identifier, with abbreviation expansion.
- `spans`: the start and end index of each abbreviation, the text of the abbreviation and its corresponding expansion.
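The `spans` field makes it possible to go from the segmentation to the expansion. A minimal sketch, assuming each abbreviation in `spans["text"]` occurs verbatim in the segmented identifier (the helper name `apply_expansions` is illustrative, not part of the dataset):

```python
# Sketch: reconstruct the `expansion` field from `segmentation` + `spans`.
# Assumes each abbreviation in spans["text"] occurs verbatim in the
# segmented identifier; the helper name is illustrative, not a dataset API.
def apply_expansions(segmentation: str, spans: dict) -> str:
    expanded = segmentation
    for abbrev, full in zip(spans["text"], spans["expansion"]):
        # Replace only the first occurrence to keep spans one-to-one.
        expanded = expanded.replace(abbrev, full, 1)
    return expanded

example = {
    "segmentation": "abs path",
    "spans": {"text": ["abs"], "expansion": ["absolute"]},
}
print(apply_expansions(example["segmentation"], example["spans"]))
# -> absolute path
```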
## Dataset Creation
- All hashtag segmentation and identifier splitting datasets on this profile have the same basic fields: `hashtag` and `segmentation` or `identifier` and `segmentation`.
- The only difference between `hashtag` and `segmentation` or between `identifier` and `segmentation` is the whitespace characters. Spell checking, expanding abbreviations or correcting characters to uppercase go into other fields.
- There is always whitespace between an alphanumeric character and a sequence of any special characters (such as `_`, `:`, `~`).
- If there are any annotations for named entity recognition and other token classification tasks, they are given in a `spans` field.
### Citation Information
```
@inproceedings{madani2010recognizing,
title={Recognizing words from source code identifiers using speech recognition techniques},
author={Madani, Nioosha and Guerrouj, Latifa and Di Penta, Massimiliano and Gueheneuc, Yann-Gael and Antoniol, Giuliano},
booktitle={2010 14th European Conference on Software Maintenance and Reengineering},
pages={68--77},
year={2010},
organization={IEEE}
}
```
### Contributions
This dataset was added by [@ruanchaves](https://github.com/ruanchaves) while developing the [hashformers](https://github.com/ruanchaves/hashformers) library. |
PlanTL-GOB-ES/MLDoc |
---
annotations_creators:
- expert-generated
language:
- es
language_creators:
- found
multilinguality:
- multilingual
pretty_name: MLDoc
license: cc-by-nc-4.0
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-classification
task_ids: []
---
# MLDoc
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Website:** https://github.com/facebookresearch/MLDoc
### Dataset Summary
For document classification, we use the Multilingual Document Classification Corpus (MLDoc) [(Schwenk and Li, 2018)](http://www.lrec-conf.org/proceedings/lrec2018/pdf/658.pdf), a cross-lingual document classification dataset covering 8 languages. We use the Spanish portion to evaluate our models on monolingual classification as part of the EvalEs Spanish language benchmark. The corpus consists of 14,458 news articles from Reuters classified in four categories: Corporate/Industrial, Economics, Government/Social and Markets.
This dataset can't be downloaded straight from HuggingFace as it requires signing specific agreements. The detailed instructions on how to download it can be found in this [repository](https://github.com/facebookresearch/MLDoc).
### Supported Tasks and Leaderboards
Text Classification
### Languages
The dataset is in English, German, French, Spanish, Italian, Russian, Japanese and Chinese.
## Dataset Structure
### Data Instances
<pre>
MCAT b' FRANCFORT, 17 feb (Reuter) - La Bolsa de Francfort abri\xc3\xb3 la sesi\xc3\xb3n de corros con baja por la ca\xc3\xadda del viernes en Wall Street y una toma de beneficios. El d\xc3\xb3lar ayudaba a apuntalar al mercado, que pronto podr\xc3\xada reanudar su tendencia alcista. Volkswagen bajaba por los da\xc3\xb1os ocasionados por la huelga de camioneros en Espa\xc3\xb1a. Preussag participaba en un joint venture de exploraci\xc3\xb3n petrol\xc3\xadfera en Filipinas con Atlantic Richfield Co. A las 0951 GMT, el Dax 30 bajaba 10,49 puntos, un 0,32 pct, a 3.237,69 tras abrir a un m\xc3\xa1ximo de 3.237,69. (c) Reuters Limited 1997. '
</pre>
### Data Fields
- Label: CCAT (Corporate/Industrial), ECAT (Economics), GCAT (Government/Social) and MCAT (Markets)
- Text
### Data Splits
- train.tsv: 9,458 lines
- valid.tsv: 1,000 lines
- test.tsv: 4,000 lines
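Since the corpus must be obtained separately, loading it locally typically means parsing the TSV files yourself. A minimal sketch, assuming each line follows the `label<TAB>text` layout implied by the instance above (the exact file format should be checked against the files produced by the MLDoc scripts):

```python
# Sketch: parse one MLDoc-style TSV line into (label, text).
# Assumes a label<TAB>text layout; the real files must be obtained
# via the NIST/Reuters agreement described below.
def parse_mldoc_line(line: str) -> tuple[str, str]:
    # Split on the first tab only, so tabs inside the article survive.
    label, text = line.rstrip("\n").split("\t", 1)
    return label, text

label, text = parse_mldoc_line("MCAT\tFRANCFORT, 17 feb (Reuter) - La Bolsa...")
print(label)  # -> MCAT
```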
## Dataset Creation
### Curation Rationale
[N/A]
### Source Data
The source data is from the Reuters Corpus. In 2000, Reuters Ltd made available a large collection of Reuters News stories for use in research and development of natural language processing, information retrieval, and machine learning systems. This corpus, known as "Reuters Corpus, Volume 1" or RCV1, is significantly larger than the older, well-known Reuters-21578 collection heavily used in the text classification community.
For more information visit the paper [(Lewis et al., 2004)](https://www.jmlr.org/papers/volume5/lewis04a/lewis04a.pdf).
#### Initial Data Collection and Normalization
For more information visit the paper [(Lewis et al., 2004)](https://www.jmlr.org/papers/volume5/lewis04a/lewis04a.pdf).
#### Who are the source language producers?
For more information visit the paper [(Lewis et al., 2004)](https://www.jmlr.org/papers/volume5/lewis04a/lewis04a.pdf).
### Annotations
#### Annotation process
For more information visit the paper [(Schwenk and Li, 2018; Lewis et al., 2004)](http://www.lrec-conf.org/proceedings/lrec2018/pdf/658.pdf).
#### Who are the annotators?
For more information visit the paper [(Schwenk and Li, 2018; Lewis et al., 2004)](http://www.lrec-conf.org/proceedings/lrec2018/pdf/658.pdf).
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
This dataset contributes to the development of language models in Spanish.
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
[N/A]
### Licensing Information
Access to the actual news stories of the Reuters Corpus (both RCV1 and RCV2) requires a NIST agreement. The stories in the Reuters Corpus are under the copyright of Reuters Ltd and/or Thomson Reuters, and their use is governed by the following agreements:
- Organizational agreement: This agreement must be signed by the person responsible for the data at your organization, and sent to NIST.
- Individual agreement: This agreement must be signed by all researchers using the Reuters Corpus at your organization, and kept on file at your organization.
For more information about the agreement, see [here](https://trec.nist.gov/data/reuters/reuters.html).
### Citation Information
The following paper must be cited when using this corpus:
```
@InProceedings{SCHWENK18.658,
author = {Holger Schwenk and Xian Li},
title = {A Corpus for Multilingual Document Classification in Eight Languages},
booktitle = {Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)},
year = {2018},
month = {may},
date = {7-12},
location = {Miyazaki, Japan},
editor = {Nicoletta Calzolari (Conference chair) and Khalid Choukri and Christopher Cieri and Thierry Declerck and Sara Goggi and Koiti Hasida and Hitoshi Isahara and Bente Maegaard and Joseph Mariani and Hélène Mazo and Asuncion Moreno and Jan Odijk and Stelios Piperidis and Takenobu Tokunaga},
publisher = {European Language Resources Association (ELRA)},
address = {Paris, France},
isbn = {979-10-95546-00-9},
language = {english}
}
@inproceedings{schwenk-li-2018-corpus,
title = "A Corpus for Multilingual Document Classification in Eight Languages",
author = "Schwenk, Holger and
Li, Xian",
booktitle = "Proceedings of the Eleventh International Conference on Language Resources and Evaluation ({LREC} 2018)",
month = may,
year = "2018",
address = "Miyazaki, Japan",
publisher = "European Language Resources Association (ELRA)",
url = "https://aclanthology.org/L18-1560",
}
```
|
Emm9625/SlimOrca-dedupe-4k | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 6608271.511536737
num_examples: 4000
- name: test
num_bytes: 660827.1511536737
num_examples: 400
download_size: 3673854
dataset_size: 7269098.66269041
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
HADESJUDGEMENT/Art | ---
license: unknown
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/107a1506 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1340
dataset_size: 182
---
# Dataset Card for "107a1506"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cloudyu__mistral_28B_instruct_v0.1 | ---
pretty_name: Evaluation run of cloudyu/mistral_28B_instruct_v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cloudyu/mistral_28B_instruct_v0.1](https://huggingface.co/cloudyu/mistral_28B_instruct_v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__mistral_28B_instruct_v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-05T07:24:18.194696](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__mistral_28B_instruct_v0.1/blob/main/results_2024-03-05T07-24-18.194696.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6048165720098292,\n\
\ \"acc_stderr\": 0.03318918937904906,\n \"acc_norm\": 0.6103008292910913,\n\
\ \"acc_norm_stderr\": 0.03386256440684409,\n \"mc1\": 0.44920440636474906,\n\
\ \"mc1_stderr\": 0.01741294198611531,\n \"mc2\": 0.6416534233937677,\n\
\ \"mc2_stderr\": 0.015338593594048558\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.53839590443686,\n \"acc_stderr\": 0.014568245550296358,\n\
\ \"acc_norm\": 0.5836177474402731,\n \"acc_norm_stderr\": 0.01440561827943618\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6085441147181836,\n\
\ \"acc_stderr\": 0.004870785036708288,\n \"acc_norm\": 0.8053176658036247,\n\
\ \"acc_norm_stderr\": 0.0039514673865977306\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518025,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518025\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686859,\n \
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686859\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016012,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016012\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.02308663508684141,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.02308663508684141\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n\
\ \"acc_stderr\": 0.014551310568143698,\n \"acc_norm\": 0.7905491698595147,\n\
\ \"acc_norm_stderr\": 0.014551310568143698\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608408,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608408\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\
\ \"acc_stderr\": 0.015414494487903226,\n \"acc_norm\": 0.30614525139664805,\n\
\ \"acc_norm_stderr\": 0.015414494487903226\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4061277705345502,\n\
\ \"acc_stderr\": 0.012543154588412942,\n \"acc_norm\": 0.4061277705345502,\n\
\ \"acc_norm_stderr\": 0.012543154588412942\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411955,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411955\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44920440636474906,\n\
\ \"mc1_stderr\": 0.01741294198611531,\n \"mc2\": 0.6416534233937677,\n\
\ \"mc2_stderr\": 0.015338593594048558\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.01219848910025978\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.379833206974981,\n \
\ \"acc_stderr\": 0.013368818096960496\n }\n}\n```"
repo_url: https://huggingface.co/cloudyu/mistral_28B_instruct_v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|arc:challenge|25_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|gsm8k|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hellaswag|10_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T07-24-18.194696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T07-24-18.194696.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- '**/details_harness|winogrande|5_2024-03-05T07-24-18.194696.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-05T07-24-18.194696.parquet'
- config_name: results
data_files:
- split: 2024_03_05T07_24_18.194696
path:
- results_2024-03-05T07-24-18.194696.parquet
- split: latest
path:
- results_2024-03-05T07-24-18.194696.parquet
---
# Dataset Card for Evaluation run of cloudyu/mistral_28B_instruct_v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/mistral_28B_instruct_v0.1](https://huggingface.co/cloudyu/mistral_28B_instruct_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
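Split names such as `2024_03_05T07_24_18.194696` are simply the run timestamp with the `-` and `:` separators replaced by underscores. A small illustrative helper (not part of the leaderboard tooling) can recover a `datetime` from a split name:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Parse a timestamp-style split name back into a datetime.

    Illustrative only: assumes the "YYYY_MM_DDTHH_MM_SS.ffffff" naming
    convention used by the splits in this dataset.
    """
    # Restore the first two separators to "-" (date part) ...
    iso_like = split_name.replace("_", "-", 2)
    # ... and the remaining ones to ":" (time part).
    iso_like = iso_like.replace("_", ":")
    return datetime.strptime(iso_like, "%Y-%m-%dT%H:%M:%S.%f")

run_time = split_to_datetime("2024_03_05T07_24_18.194696")
```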
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__mistral_28B_instruct_v0.1",
"harness_winogrande_5",
split="train")
```
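The config names in this card are sanitized versions of the harness task identifiers (the `|`, `-`, and `:` characters are replaced by underscores). A sketch of that mapping, using an illustrative helper that is not part of the `datasets` library:

```python
def task_to_config_name(task_id: str) -> str:
    """Map a harness task id to the config name used in this dataset.

    Illustrative helper: assumes the sanitization scheme observed in this
    card, e.g. "harness|truthfulqa:mc|0" -> "harness_truthfulqa_mc_0".
    """
    return task_id.replace("|", "_").replace("-", "_").replace(":", "_")

config = task_to_config_name("harness|winogrande|5")
```

The resulting string can then be passed as the second argument to `load_dataset`, as in the example above.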
## Latest results
These are the [latest results from run 2024-03-05T07:24:18.194696](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__mistral_28B_instruct_v0.1/blob/main/results_2024-03-05T07%3A24%3A18.194696.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6048165720098292,
"acc_stderr": 0.03318918937904906,
"acc_norm": 0.6103008292910913,
"acc_norm_stderr": 0.03386256440684409,
"mc1": 0.44920440636474906,
"mc1_stderr": 0.01741294198611531,
"mc2": 0.6416534233937677,
"mc2_stderr": 0.015338593594048558
},
"harness|arc:challenge|25": {
"acc": 0.53839590443686,
"acc_stderr": 0.014568245550296358,
"acc_norm": 0.5836177474402731,
"acc_norm_stderr": 0.01440561827943618
},
"harness|hellaswag|10": {
"acc": 0.6085441147181836,
"acc_stderr": 0.004870785036708288,
"acc_norm": 0.8053176658036247,
"acc_norm_stderr": 0.0039514673865977306
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518025,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518025
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.02515826601686859,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.02515826601686859
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016012,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016012
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729245,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729245
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.02308663508684141,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.02308663508684141
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7905491698595147,
"acc_stderr": 0.014551310568143698,
"acc_norm": 0.7905491698595147,
"acc_norm_stderr": 0.014551310568143698
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.025190181327608408,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.025190181327608408
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903226,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903226
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818774,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4061277705345502,
"acc_stderr": 0.012543154588412942,
"acc_norm": 0.4061277705345502,
"acc_norm_stderr": 0.012543154588412942
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411955,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534205,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534205
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44920440636474906,
"mc1_stderr": 0.01741294198611531,
"mc2": 0.6416534233937677,
"mc2_stderr": 0.015338593594048558
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.01219848910025978
},
"harness|gsm8k|5": {
"acc": 0.379833206974981,
"acc_stderr": 0.013368818096960496
}
}
```
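The per-subject entries above can be aggregated by hand; the leaderboard's MMLU-style score is, to a first approximation, the unweighted mean of the per-subject `acc` values. A minimal sketch using a handful of the subjects from the results above (the real score averages all 57 subjects):

```python
# A few per-subject accuracies copied from the results JSON above.
subject_acc = {
    "abstract_algebra": 0.31,
    "anatomy": 0.6,
    "astronomy": 0.6381578947368421,
    "business_ethics": 0.55,
}

# Unweighted mean over subjects -- an approximation of the MMLU-style
# aggregate, computed here over only four subjects for illustration.
mmlu_like = sum(subject_acc.values()) / len(subject_acc)
```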
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilabel-internal-testing/instruction-dataset-sample | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: test
num_bytes: 8005
num_examples: 10
download_size: 8330
dataset_size: 8005
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
qwtreue5u5trhdfgh/IllusionDiffusionGC | ---
license: mit
---
|
open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4 | ---
pretty_name: Evaluation run of jondurbin/airoboros-65b-gpt4-1.4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-65b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T17:40:12.862636](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4/blob/main/results_2023-10-29T17-40-12.862636.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.048133389261744965,\n\
\ \"em_stderr\": 0.0021920523387187097,\n \"f1\": 0.11759647651006695,\n\
\ \"f1_stderr\": 0.0024834303220337473,\n \"acc\": 0.4884045517729164,\n\
\ \"acc_stderr\": 0.010955153685387409\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.048133389261744965,\n \"em_stderr\": 0.0021920523387187097,\n\
\ \"f1\": 0.11759647651006695,\n \"f1_stderr\": 0.0024834303220337473\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18043972706595907,\n \
\ \"acc_stderr\": 0.010592508589147896\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626922\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|arc:challenge|25_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T05_17_46.947970
path:
- '**/details_harness|drop|3_2023-10-23T05-17-46.947970.parquet'
- split: 2023_10_23T05_35_00.206888
path:
- '**/details_harness|drop|3_2023-10-23T05-35-00.206888.parquet'
- split: 2023_10_29T17_40_12.862636
path:
- '**/details_harness|drop|3_2023-10-29T17-40-12.862636.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T17-40-12.862636.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T05_17_46.947970
path:
- '**/details_harness|gsm8k|5_2023-10-23T05-17-46.947970.parquet'
- split: 2023_10_23T05_35_00.206888
path:
- '**/details_harness|gsm8k|5_2023-10-23T05-35-00.206888.parquet'
- split: 2023_10_29T17_40_12.862636
path:
- '**/details_harness|gsm8k|5_2023-10-29T17-40-12.862636.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T17-40-12.862636.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hellaswag|10_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T18:17:34.414751.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T18:17:34.414751.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T18:17:34.414751.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T05_17_46.947970
path:
- '**/details_harness|winogrande|5_2023-10-23T05-17-46.947970.parquet'
- split: 2023_10_23T05_35_00.206888
path:
- '**/details_harness|winogrande|5_2023-10-23T05-35-00.206888.parquet'
- split: 2023_10_29T17_40_12.862636
path:
- '**/details_harness|winogrande|5_2023-10-29T17-40-12.862636.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T17-40-12.862636.parquet'
- config_name: results
data_files:
- split: 2023_08_09T18_17_34.414751
path:
- results_2023-08-09T18:17:34.414751.parquet
- split: 2023_10_23T05_17_46.947970
path:
- results_2023-10-23T05-17-46.947970.parquet
- split: 2023_10_23T05_35_00.206888
path:
- results_2023-10-23T05-35-00.206888.parquet
- split: 2023_10_29T17_40_12.862636
path:
- results_2023-10-29T17-40-12.862636.parquet
- split: latest
path:
- results_2023-10-29T17-40-12.862636.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T17:40:12.862636](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4/blob/main/results_2023-10-29T17-40-12.862636.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.048133389261744965,
"em_stderr": 0.0021920523387187097,
"f1": 0.11759647651006695,
"f1_stderr": 0.0024834303220337473,
"acc": 0.4884045517729164,
"acc_stderr": 0.010955153685387409
},
"harness|drop|3": {
"em": 0.048133389261744965,
"em_stderr": 0.0021920523387187097,
"f1": 0.11759647651006695,
"f1_stderr": 0.0024834303220337473
},
"harness|gsm8k|5": {
"acc": 0.18043972706595907,
"acc_stderr": 0.010592508589147896
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626922
}
}
```
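As a quick sanity check, the overall `acc` in the `"all"` block is the unweighted mean of the two per-task accuracies reported above:

```python
# Recompute the aggregated accuracy from the per-task values in the JSON;
# gsm8k and winogrande are the only acc-bearing tasks in this run.
gsm8k_acc = 0.18043972706595907
winogrande_acc = 0.7963693764798737

overall_acc = (gsm8k_acc + winogrande_acc) / 2
print(overall_acc)  # ≈ 0.4884045517729164, matching the "all" block
```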
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ch08931/sandyOV2 | ---
license: openrail
---
|
YuuSorata/Ellis | ---
license: openrail
---
|
tyzhu/squad_qa_no_id_v5_full_random_permute_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 5042862.974380165
num_examples: 3365
- name: validation
num_bytes: 342766
num_examples: 300
download_size: 1264341
dataset_size: 5385628.974380165
---
# Dataset Card for "squad_qa_no_id_v5_full_random_permute_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mainuzzaman/newcoursedataset | ---
license: apache-2.0
---
|
NexaAI/fill50K | ---
license: openrail
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2025713512.0
num_examples: 50000
download_size: 1910547936
dataset_size: 2025713512.0
---
|
ruanchaves/bt11 | ---
annotations_creators:
- expert-generated
language_creators:
- machine-generated
language:
- code
license:
- unknown
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- original
task_categories:
- structure-prediction
task_ids: []
pretty_name: BT11
tags:
- word-segmentation
---
# Dataset Card for BT11
## Dataset Description
- **Paper:** [Helpful or Not? An investigation on the feasibility of identifier splitting via CNN-BiLSTM-CRF](https://ksiresearch.org/seke/seke18paper/seke18paper_167.pdf)
### Dataset Summary
In programming languages, identifiers are tokens (also called symbols) which name language entities.
Some of the kinds of entities an identifier might denote include variables, types, labels, subroutines, and packages.
BT11 is a dataset for identifier segmentation, i.e. the task of adding spaces between the words in an identifier.
### Languages
- Java
## Dataset Structure
### Data Instances
```
{
"index": 20170,
"identifier": "currentLineHighlight",
"segmentation": "current Line Highlight"
}
```
### Data Fields
- `index`: a numerical index.
- `identifier`: the original identifier.
- `segmentation`: the gold segmentation for the identifier.
## Dataset Creation
- All hashtag segmentation and identifier splitting datasets on this profile have the same basic fields: `hashtag` and `segmentation` or `identifier` and `segmentation`.
- The only difference between `hashtag` and `segmentation` or between `identifier` and `segmentation` is the whitespace characters. Spell checking, expanding abbreviations, or correcting characters to uppercase go into other fields.
- There is always whitespace between an alphanumeric character and a sequence of any special characters (such as `_`, `:`, `~`).
- If there are any annotations for named entity recognition and other token classification tasks, they are given in a `spans` field.
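The whitespace-only invariant described above can be checked with a short snippet (the helper name is ours; the record is the one shown under "Data Instances"):

```python
def matches_identifier(identifier: str, segmentation: str) -> bool:
    """True when removing the spaces from a segmentation recovers the identifier."""
    return segmentation.replace(" ", "") == identifier

# The sample instance from "Data Instances" above.
print(matches_identifier("currentLineHighlight", "current Line Highlight"))  # True
```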
## Additional Information
### Citation Information
```
@inproceedings{butler2011improving,
title={Improving the tokenisation of identifier names},
author={Butler, Simon and Wermelinger, Michel and Yu, Yijun and Sharp, Helen},
booktitle={European Conference on Object-Oriented Programming},
pages={130--154},
year={2011},
organization={Springer}
}
```
### Contributions
This dataset was added by [@ruanchaves](https://github.com/ruanchaves) while developing the [hashformers](https://github.com/ruanchaves/hashformers) library. |
puhsu/tabular-benchmarks | ---
task_categories:
- tabular-classification
- tabular-regression
pretty_name: tabular-benchmarks
---
Datasets used in the paper TODO
To download the archive, you can use:
```bash
wget https://huggingface.co/datasets/puhsu/tabular-benchmarks/resolve/main/data.tar
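# The archive can then be unpacked in place (directory layout assumed, adjust as needed):
tar -xf data.tar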
``` |
r0ll/Ivanzolo2004 | ---
license: openrail
language:
- ru
--- |
open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat | ---
pretty_name: Evaluation run of bofenghuang/vigogne-2-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bofenghuang/vigogne-2-7b-chat](https://huggingface.co/bofenghuang/vigogne-2-7b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T13:35:42.061271](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat/blob/main/results_2023-09-22T13-35-42.061271.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2779991610738255,\n\
\ \"em_stderr\": 0.0045880722162316605,\n \"f1\": 0.32825188758389273,\n\
\ \"f1_stderr\": 0.004516960799751206,\n \"acc\": 0.41080456661279235,\n\
\ \"acc_stderr\": 0.00980948368433141\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2779991610738255,\n \"em_stderr\": 0.0045880722162316605,\n\
\ \"f1\": 0.32825188758389273,\n \"f1_stderr\": 0.004516960799751206\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \
\ \"acc_stderr\": 0.007357713523222347\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440473\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bofenghuang/vigogne-2-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T13_35_42.061271
path:
- '**/details_harness|drop|3_2023-09-22T13-35-42.061271.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T13-35-42.061271.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T13_35_42.061271
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-35-42.061271.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T13-35-42.061271.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:29:32.035608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T13_29_32.035608
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:29:32.035608.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T13:29:32.035608.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T13_35_42.061271
path:
- '**/details_harness|winogrande|5_2023-09-22T13-35-42.061271.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T13-35-42.061271.parquet'
- config_name: results
data_files:
- split: 2023_09_22T13_35_42.061271
path:
- results_2023-09-22T13-35-42.061271.parquet
- split: latest
path:
- results_2023-09-22T13-35-42.061271.parquet
---
# Dataset Card for Evaluation run of bofenghuang/vigogne-2-7b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bofenghuang/vigogne-2-7b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-2-7b-chat](https://huggingface.co/bofenghuang/vigogne-2-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T13:35:42.061271](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat/blob/main/results_2023-09-22T13-35-42.061271.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2779991610738255,
"em_stderr": 0.0045880722162316605,
"f1": 0.32825188758389273,
"f1_stderr": 0.004516960799751206,
"acc": 0.41080456661279235,
"acc_stderr": 0.00980948368433141
},
"harness|drop|3": {
"em": 0.2779991610738255,
"em_stderr": 0.0045880722162316605,
"f1": 0.32825188758389273,
"f1_stderr": 0.004516960799751206
},
"harness|gsm8k|5": {
"acc": 0.07733131159969674,
"acc_stderr": 0.007357713523222347
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440473
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/fbd4ca1b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1333
dataset_size: 186
---
# Dataset Card for "fbd4ca1b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Amanaccessassist/playsent10m | ---
dataset_info:
features:
- name: final_text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Negative
'1': Positive
splits:
- name: train
num_bytes: 61468420.68408292
num_examples: 644415
- name: test
num_bytes: 6829856.31591708
num_examples: 71602
download_size: 37372122
dataset_size: 68298277.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
nikchar/claim_detection_paper_test_squeezebert | ---
dataset_info:
features:
- name: label
dtype: string
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: Is_Claim
dtype: string
- name: Claim_detection_result
dtype: string
splits:
- name: train
num_bytes: 1175947
num_examples: 11073
download_size: 507280
dataset_size: 1175947
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "claim_detection_paper_test_squeezebert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gcjavi/dataviewer-test | ---
annotations_creators:
- expert-generated
language:
- gl
license:
- mit
multilinguality:
- monolingual
dataset_info:
- config_name: config
features:
- name: audio_id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
CristoJV/image-folder | ---
license: mit
---
|
minnesotanlp/LLM-Artifacts | ---
configs:
- config_name: task_label
data_files: "intermodel_cleaned_maj_min.csv"
- config_name: preference_p2c
data_files: "p2c_human_gpt3_pref.csv"
- config_name: preference_cobbler_GPT4
data_files: "cobbler_gpt4.csv"
- config_name: preference_cobbler_ChatGPT
data_files: "cobbler_chatgpt.csv"
- config_name: instruction
data_files: "first_order_annotations.csv"
- config_name: simulation_roleflip
data_files: "CAMEL_annotated.csv"
- config_name: simulation_digression
data_files: "spp_digression_fin.csv"
- config_name: freeform_deepfake_human
data_files: "deepfake_human.csv"
- config_name: freeform_deepfake_machine
data_files: "deepfake_machine-002.csv"
- config_name: freeform_hc3_human
data_files: "hc3_human.csv"
- config_name: freeform_hc3_machine
data_files: "hc3_machine.csv"
- config_name: freeform_worker_human
data_files: "worker_human.csv"
- config_name: freeform_worker_machine
data_files: "worker_machine.csv"
- config_name: qual_tasklabel
data_files: "qual_tasklabel.csv"
- config_name: qual_preference_p2c
data_files: "qual_preference_p2c.csv"
- config_name: qual_freeform
data_files: "qual_freetext.csv"
---
<div align="center">
<h1>Under the Surface: Tracking the Artifactuality of LLM-Generated Data</h1>
<!-- **Authors:** -->
_**Debarati Das<sup>†</sup><sup>¶</sup>, Karin de Langis<sup>¶</sup>, Anna Martin-Boyle<sup>¶</sup>, Jaehyung Kim<sup>¶</sup>, Minhwa Lee<sup>¶</sup>, Zae Myung Kim<sup>¶</sup><br>**_
_**Shirley Anugrah Hayati, Risako Owan, Bin Hu, Ritik Sachin Parkar, Ryan Koo,
Jong Inn Park, Aahan Tyagi, Libby Ferland, Sanjali Roy, Vincent Liu**_
_**Dongyeop Kang<br>**_
_**Minnesota NLP, University of Minnesota Twin Cities**_
<!-- **Affiliations:** -->
<sup>†</sup> Project Lead,
<sup>¶</sup> Core Contribution,
<a href="https://arxiv.org/abs/2401.14698"> Arxiv </a>
<a href="https://minnesotanlp.github.io/artifact/"> Project Page </a>
</div>
## 📌 Table of Contents
- [Introduction](#🚀-introduction)
- [Dataset Structure](#📝-dataset)
- [Task Label](#1-task-label)
- [Preference](#2-preference)
- [Instructions](#3-instructions)
- [Simulation](#4-simulation)
- [Free-form Text](#5-free-form-text)
- [Citation](#📚-citation)
## 🚀 Introduction
<div align="center">
<img src="iceberg_modified.png" style="width:50%;height:auto;" align="center">
</div>
We present a pioneering effort in gathering a diverse range of text data produced by LLMs, covering everything from more structured "task labels" to open-ended "free-form text." This comprehensive collection is significant as it allows for a unique and holistic examination of LLM outputs and provides insights into how LLMs perform under varying degrees of structure and freedom, which is essential for both understanding their current state and guiding future improvements and applications.
We aggregate and conduct comprehensive stress tests on various data generated by LLMs using the existing benchmarks, offering a thorough evaluation of the quality, consistency, and reliability of LLM outputs across diverse models and scenarios, thereby providing a groundbreaking insight into their strengths and weaknesses for future research and development.
Our research emphasizes the critical need for responsible and ethical practices in creating and using LLM-generated data, advocating for collaborative efforts among stakeholders to address biases, increase diversity, and deepen the understanding of complex human opinions in LLM outputs, thereby ensuring their development benefits society ethically and sustainably.
## 📝 Dataset
The dataset consists of **five** different types of LLM-generated data: **(1) Task Labels, (2) Preference, (3) Instructions, (4) Simulation, and (5) Free-form Texts**.
<hr>
### 1. Task Label
#### (1) Dataset Info
Contains human/machine annotations from source datasets and their majority/minority label aggregations.
#### (2) Data Sources - License
- [Social Bias Frames (SBIC)](https://huggingface.co/datasets/social_bias_frames) - cc-by-4.0
- [GAB Hate Corpus (GHC)](https://osf.io/edua3/) - cc-by-4.0 International
- [Age-Related-Sentiment (Sentiment)](https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/F6EMTS) - CC0 1.0 Universal
- [Social Chemistry (Schem5Labels)](https://github.com/mbforbes/social-chemistry-101) - CC BY-SA 4.0
#### (3) Column Info
- `'model_name'`: specifies the model that was prompted to generate the model annotations for the text. This can take the values: vicuna, baize, llama2, koala, open_ai_gpt35turbo
- `'dataset_name'`: specifies the source dataset of the text. This can take values: SBIC, GHC, Sentiment, and Schem5Labels
- `'text_ind'`: this is the unique index of the text in the complete dataset
- `'text'`: this is the text which the human or machine needs to provide an annotation for
- `'prompt'`: This is the prompt provided to the model for the annotation task
- `'human_annots'`: This consists of the list of annotations generated by human annotators for this task. These are ordinal categorical variables.
- `'model_annots'`: This consists of the list of annotations generated by model annotators for this task. These are ordinal categorical variables. If a value is -1 in this list, it means the model did not return a response for this text.
- `'human_majority'`: this consists of a list containing the majority annotation value(s) among the human-annotated list for that text.
- `'machine_majority'`: this consists of a list containing the majority annotation value(s) among the machine-annotated list for that text.
- `'human_minority'`: this consists of a list containing the minority annotation value(s) among the human-annotated list for that text.
- `'machine_minority'`: this consists of a list containing the minority annotation value(s) among the machine-annotated list for that text.
#### (4) How to access
There is one subset associated with this data type:
- **task_label**: intermodel setup with majority/minority opinions aggregated from all data sources
Use the example code below to load the task label subset. Change the subset name as needed.
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "task_label", split='train') # streaming=True (optional)
```
#### (5) Qualitative Analysis
To view the examples used in the qualitative analysis regarding bias annotations, use the code below:
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "qual_tasklabel", split='train')
```
#### (6) Others
For majority/minority calculation, please note the following:
- A list of the majority or minority values in the passed list is returned. For example, if the given input list is [1.0, 1.0, 2.0, 2.0, 3.0], then the majority value will be [1.0, 2.0] and the minority value will be [3.0].
- If all values in the annotation list are -1, then no valid majority or minority can be calculated. Therefore, None is returned.
- If all unique values are present in the annotation list, then no valid majority or minority can be calculated. Therefore, None is returned.
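A minimal sketch of this aggregation logic (an illustrative reimplementation of the rules above, not the authors' exact code):

```python
from collections import Counter

def majority_minority(annots):
    """Return (majority values, minority values) per the rules above."""
    # All -1 means no valid annotations were returned.
    if all(a == -1 for a in annots):
        return None, None
    counts = Counter(annots)
    # If every value occurs equally often (e.g. all annotations are
    # distinct), no valid majority or minority can be calculated.
    if len(counts) > 1 and len(set(counts.values())) == 1:
        return None, None
    max_n, min_n = max(counts.values()), min(counts.values())
    majority = [v for v, n in counts.items() if n == max_n]
    minority = [v for v, n in counts.items() if n == min_n]
    return majority, minority

print(majority_minority([1.0, 1.0, 2.0, 2.0, 3.0]))  # ([1.0, 2.0], [3.0])
```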
<hr>
### 2. Preference
#### (1) Dataset Info
Contains human/machine preferences from source datasets, along with the lexicon-based (for p2c) and entailment-based (for CoBBLEr) preferences.
#### (2) Data Sources (License)
- [Prefer to Classify ('p2c')](https://arxiv.org/pdf/2306.04925.pdf)
- Note that the sentences are originally extracted from [DynaSent Round 2](https://huggingface.co/datasets/dynabench/dynasent/viewer/dynabench.dynasent.r2.all)
- [CoBBLEr](https://minnesotanlp.github.io/cobbler-project-page/demo/index.html)
- The sentences are originally extracted from [Eli5](https://huggingface.co/datasets/eli5) and [BigBench](https://huggingface.co/datasets/bigbench).
#### (3) Column Info
For each row, there is a pair of sentences ('sent_1' and 'sent_2') with human and machine preferences.
- Preference Label 0: prefer sent_1
- Preference Label 1: prefer sent_2
- Preference Label 2: tie (no preference)
For the p2c dataset, each row also contains the sentiment-lexicon-based preference and the difference score between the two sentences.
- `'sent_1'`: sentence 1 of a pair
- `'sent_2'`: sentence 2 of a pair
- `'gold_label'`: the gold sentiment label of both `'sent_1'` and `'sent_2'` (e.g., positive/negative/neutral)
- `'human_pref'`: human preference
- `'gpt3_pref'`: GPT-3 preference
- `'lexicon_pref'`: the lexicon-based preference between `'sent_1'` and `'sent_2'`
- `'lexicon_diff'`: the difference in lexicon scores between sentence pairs
For the CoBBLEr dataset, each row also contains the textual-entailment-based preference and the difference score between the two sentences.
- `'model_1'`: the model name that generated sentence 1
- `'model_2'`: the model name that generated sentence 2
- `'sentence_1'`: sentence 1 of a pair
- `'sentence_2'`: sentence 2 of a pair
- `'human_pref'`: human preference
- `'machine_pref'`: LLM preference (GPT-4 or ChatGPT)
- `'entail_pref'`: the entailment-based preference between `'sentence_1'` and `'sentence_2'`
- `'entail_diff'`: the difference in entailment scores (computed by RoBERTa-large-MNLI) between two sentences in a pair.
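Since the human and machine preferences share the same 0/1/2 encoding, human–LLM agreement can be computed directly. A toy sketch with made-up labels (illustrative values only, not real data):

```python
# 0 = prefer sentence 1, 1 = prefer sentence 2, 2 = tie (no preference).
human_pref   = [0, 1, 2, 0, 1]   # illustrative values
machine_pref = [0, 1, 0, 2, 1]

# Fraction of pairs where the LLM judge matches the human preference.
agreement = sum(h == m for h, m in zip(human_pref, machine_pref)) / len(human_pref)
print(agreement)  # 0.6
```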
#### (4) How to access
There are three subsets associated with this data type:
- **preference_p2c**: p2c data with human and GPT-3 preferences
- **preference_cobbler_gpt4**: cobbler data with human and GPT-4 preferences
- **preference_cobbler_chatgpt**: cobbler with human and ChatGPT preferences
Use the example code below to load the preference_cobbler_gpt4 subset. Change the subset name as needed.
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "preference_cobbler_gpt4", split='train')
```
#### (5) Qualitative Analysis
For the `'p2c'` dataset, we release the data with each sentence in a pair annotated with lexicons extracted following [Hayati et al. (2021)](https://aclanthology.org/2021.emnlp-main.510/).
Several columns in this data contain a dictionary where each key is an extracted lexicon and its value is the corresponding importance score.
For example, the column `'sent_{1/2}_anger'` is a dictionary of anger-related lexicons with the corresponding importance scores in the (first/second) sentence.
Our study uses the first key with the maximum value score in each lexicon group to decide lexicon-based preferences.
To use this dataset, please note the following:
```python
from datasets import load_dataset
import pandas as pd
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "qual_preference_p2c", split='train')
dataset = pd.DataFrame(dataset)
```
For sentence pairs of positive sentiment, we used the following columns:
- `'sent_{1/2}_{joy/politeness}_words'` and
- `'sent_{1/2}_sentiment_words'` that has values of greater than 0 (positive).
Conversely, for the pairs of negative sentiments, we used the following columns:
- `'sent_{1/2}_{anger/disgust/fear/sad/offensive}_words'`,
- `'sent_{1/2}_polite_words'` that has values of below 0 (rudeness) and
- `'sent_{1/2}_sentiment_words'` that has values of below 0 (negative).
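The "first key with the maximum value score" rule described above can be sketched as follows (a hypothetical helper with made-up lexicon scores, not the authors' code):

```python
def top_lexicon(lexicon_scores):
    """Return the first key with the maximum importance score, or None."""
    if not lexicon_scores:
        return None
    # max() iterates dict keys in insertion order, so ties resolve
    # to the first key reaching the maximum score.
    return max(lexicon_scores, key=lexicon_scores.get)

scores = {"delighted": 0.8, "happy": 0.8, "fine": 0.3}  # made-up values
print(top_lexicon(scores))  # delighted
```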
#### (6) Others
<hr>
### 3. Instructions
#### (1) Dataset Info
This data type contains (1) human annotations of error types in 800 examples from four different synthetic instruction datasets, and (2) three random samplings of 10k examples from each of the following datasets: Cleaned Alpaca, Dolly, Self-Instruct, and Supernatural Instructions, for a total of 30k samples per dataset (3 seeds each).
#### (2) Data Sources (License)
- [Unnatural Instructions](https://github.com/orhonovich/unnatural-instructions) - MIT
- [Self-Instruct](https://github.com/yizhongw/self-instruct) - Apache License 2.0
- [Alpaca-Cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned) - Creative Commons NonCommercial (CC BY-NC 4.0).
- [GPT-4-LLM](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM) - Creative Commons NonCommercial (CC BY-NC 4.0).
- [Dolly](https://github.com/databrickslabs/dolly) - Apache License 2.0
- [Supernatural Instructions](https://github.com/allenai/natural-instructions) - Apache License 2.0
#### (3) Column Info
(1) Error Annotations
- `'instruction'`: an instruction to follow
- `'constraints'`: samples from the Unnatural Instruction set have an additional data type called `'constraints'`, which specify the form the output should take (e.g. `output should be 'True' or 'False'`)
- `'input'`: an input to the corresponding instruction
- `'output'`: the output given the corresponding instruction and the input
- `'dataset'`: the name of the source dataset that the instruction comes from
- `'QA_type'`: the question-answer type (Open-QA or Closed-QA)
- `'error'`: the error type (one of the following: incomprehensible instruction, inconsistent input, inconsistent output, and incorrect output)
- `'second_error'`: sometimes a sample contains more than one error; a second error will be denoted in this column
- `'third_error'`: a third error will be denoted in this column
#### (4) How to access
(1) Error Annotations:
- **instruction**: first-order experiment setup with error type annotations aggregated from all data sources
Use the example code below to load the instruction subset.
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "instruction", split='train')
```
#### (5) Qualitative Analysis
The `'instruction'` experiment is based on the manual annotations of each error type found in the synthetic datasets.
Thus, if you want to view examples for qualitative analysis, use the same split information as below:
```python
from datasets import load_dataset
import pandas as pd
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "instruction", split='train')
data = pd.DataFrame(dataset)
```
#### (6) Others
**For the second-order experiment,**
Please use [this dataset (`instruction_fine-tuning_data.csv`)](https://huggingface.co/datasets/minnesotanlp/LLM-Artifacts/resolve/main/instruction_fine-tuning_data.csv).
The following is the column information:
- `'task_name'`: the name of the instruction task. Only pertains to Supernatural Instructions
- `'id'`: the Supernatural Instruction id
- `'instruction'`: an instruction to follow
- `'input'`: an input to the corresponding instruction
- `'output'`: the output given the corresponding instruction and the input
- `'categories'`: the task type. Only pertains to Supernatural Instructions
- `'source'`: the instruction source
- `'seed'`: the seed used for the random sampling. One of the following: 2021, 2022, or 2023
<hr>
### 4. Simulation
#### (1) Dataset Info
Contains (1) role-flipping information or (2) types of error in digression in simulated agent conversations.
#### (2) Data Sources (License)
- [CAMEL AI-Society](https://huggingface.co/datasets/camel-ai/ai_society) - CC-BY-NC 4.0
- [Solo Performance Prompting Grid-World (SPP)](https://github.com/MikeWangWZHL/Solo-Performance-Prompting) - N/A
#### (3) Column Info
(1) Regarding 'CAMEL':
- `'role_flipping_msg_indices'`: a list of indices of role-flipped messages in the conversation
- `'interruption_msg_indices'`: a list of indices of interruption messages in the conversation
- `'role_flipping_happens'`: boolean true when role_flipping_msg_indices is not empty
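The three CAMEL columns are related as follows; a toy record (index values are made up for illustration) shows the invariant:

```python
# A toy CAMEL record (index values are made up for illustration).
record = {
    "role_flipping_msg_indices": [3, 8],
    "interruption_msg_indices": [5],
    "role_flipping_happens": True,
}

# The boolean flag is simply "the role-flipping index list is non-empty".
derived = len(record["role_flipping_msg_indices"]) > 0
print(derived)  # True, matching record["role_flipping_happens"]
```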
(2) Regarding 'SPP':
- `'Given Task'`: The given question with a detailed description. The questions are from the SPP logic grid puzzle dataset.
- `'Task Label'`: The answer to the given question, originally provided by the SPP dataset.
- `'Response of GPT-4'`: Simulated conversations by multiple agents, generated by GPT-4. These responses are also from the SPP dataset itself (method “spp_engine-devgpt4-32k_temp-0.0_topp-1.0_start0-end200__with_sys_mes”).
- `'Prediction of digression by GPT-4'`: Binary prediction (yes or no) about the existence of digression within (c), the simulated conversation.
- `'Reasoning of digression by GPT-4'`: Reasoning about (d), the prediction of digression.
- `'Classification of digression'`: For the simulated conversations predicted by (d) to contain digression, we further classify the types of digression using GPT-4 again. For the data without digression, this field is ‘N/A’.
- `'Prediction as human-like by GPT-4'`: Binary prediction (human or ai) about the likelihood that (c), the given conversation, is a human conversation.
- `'Reasoning as human-like by GPT-4'`: Reasoning about (g), the prediction as human-like.
- `'Prediction of digression by Human Annotators'`: Binary prediction (yes or no) about the existence of digression within (c), the simulated conversation, by three different human annotators.
- `'Prediction as human-like by Human Annotators'`: Binary prediction (human or ai) about the likelihood that (c), the given conversation, is a human conversation, by three different human annotators.
#### (4) How to access
There are two subsets associated with this data type:
- **simulation_roleflip**: role-flipping information from CAMEL AI Society dataset
- **simulation_digression**: digression type information from SPP dataset
Use the example code below to load the digression subset; change the subset name as needed:
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "simulation_digression", split="train")
```
#### (5) Qualitative Analysis
Only the subset **simulation_digression** contains human/GPT annotations for each simulated conversation between agents.
Therefore, please use the following code to view the qualitative analysis part of the simulation section:
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "simulation_digression", split="train", streaming=True)
```
#### (6) Others
To obtain better predictions and their corresponding reasoning, first generate the prediction, and then generate the reasoning, as provided in the code.
<hr>
### 5. Free-form Text
#### (1) Dataset Info
Contains human- and machine-generated texts from the source datasets, along with their classifier scores.
If a machine text has a paired human text, the human text's id is stored with the machine text.
#### (2) Data Sources - License
- [Workers vs GPT ('Workers')](https://github.com/AndersGiovanni/worker_vs_gpt) - MIT
- [Human ChatGPT Comparison Corpus ('HC3')](https://huggingface.co/datasets/Hello-SimpleAI/HC3) - BSD License
- [Deepfake Text Detection in the Wild ('Deepfake')](https://huggingface.co/datasets/yaful/DeepfakeTextDetect) - Apache License 2.0
#### (3) Column Info
**Human data** – 'text', 'label', 'id', 'anger', 'disgust', 'fear', 'joy', 'neutral', 'sadness', 'surprise', 'irony', 'toxicity', 'formality', 'metaphor'
<br>
**Machine data** – 'text', 'label', 'model', 'strat', 'human_id', 'anger', 'disgust', 'fear', 'joy', 'neutral', 'sadness', 'surprise', 'irony', 'toxicity', 'formality', 'metaphor'
- `'strat'` is the prompting strategy; it is relevant only for a subset of the data. `'human_id'` is the id of the paired human text, if any
- `'label'` is the label for text classification
- The remaining attributes are classifier outputs, not ground truths
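To make the schema concrete, the split between metadata/ground-truth fields and classifier-score fields can be sketched as below. The row here is a hypothetical illustration of the column layout described above, not real data from the dataset:

```python
# Hypothetical machine-data row illustrating the column layout (not real data).
row = {
    "text": "example output", "label": 1, "model": "gpt-3.5", "strat": "few-shot",
    "human_id": "h_001", "anger": 0.01, "disgust": 0.02, "fear": 0.01, "joy": 0.6,
    "neutral": 0.3, "sadness": 0.02, "surprise": 0.04, "irony": 0.1,
    "toxicity": 0.0, "formality": 0.7, "metaphor": 0.05,
}

# Metadata / ground-truth fields vs. classifier outputs (the latter are not GTs).
metadata_cols = {"text", "label", "model", "strat", "human_id"}
classifier_scores = {k: v for k, v in row.items() if k not in metadata_cols}
print(sorted(classifier_scores))
```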
#### (4) How to access
There are six subsets associated with this data type:
- **freeform_deepfake_{(human, machine)}**: human/machine outputs from Deepfake dataset
- **freeform_hc3_{(human, machine)}**: human/machine outputs from HC3 dataset
- **freeform_workers_{(human, machine)}**: human/machine outputs from Workers dataset
Use the example code below to load the subset of human outputs from deepfake dataset.
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "freeform_deepfake_human", split="train")
```
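Because the six subset names follow the `freeform_{source}_{human|machine}` pattern, all of them can be enumerated programmatically. The `load_dataset` calls themselves need network access and the `datasets` library, so they are left commented in this sketch:

```python
from itertools import product

# The six freeform subset names follow the pattern freeform_{source}_{human|machine}.
subsets = [f"freeform_{src}_{author}"
           for src, author in product(["deepfake", "hc3", "workers"],
                                      ["human", "machine"])]
print(subsets)

# Loading them all (requires the `datasets` library and network access):
# from datasets import load_dataset
# data = {name: load_dataset("minnesotanlp/LLM-Artifacts", name, split="train")
#         for name in subsets}
```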
#### (5) Qualitative Analysis
To view examples used in the qualitative analysis, please copy and paste the below code:
```python
from datasets import load_dataset
dataset = load_dataset("minnesotanlp/LLM-Artifacts", "qual_freeform", split="train")
```
#### (6) Others
**For Discourse artifact analyses**, please download the following two pickle files to see the network motifs:
- [Network Motifs (Validation)](https://huggingface.co/datasets/minnesotanlp/LLM-Artifacts/resolve/main/DeepfakeTextDetect.validation.discourse_added.networkx_added.motifs_added.pkl)
- [Network Motifs (Test)](https://huggingface.co/datasets/minnesotanlp/LLM-Artifacts/resolve/main/DeepfakeTextDetect.test.discourse_added.networkx_added.motifs_added.pkl)
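A minimal sketch for fetching and opening one of these pickle files is below. The download step assumes the `huggingface_hub` package and network access (and unpickling may require `networkx`), so it is left commented:

```python
import pickle

REPO = "minnesotanlp/LLM-Artifacts"
MOTIF_FILES = [
    "DeepfakeTextDetect.validation.discourse_added.networkx_added.motifs_added.pkl",
    "DeepfakeTextDetect.test.discourse_added.networkx_added.motifs_added.pkl",
]

# Download and unpickle (requires network access and the huggingface_hub package):
# from huggingface_hub import hf_hub_download
# path = hf_hub_download(repo_id=REPO, repo_type="dataset", filename=MOTIF_FILES[0])
# with open(path, "rb") as f:
#     motifs = pickle.load(f)  # unpickling may require networkx to be installed

print([f.split(".")[1] for f in MOTIF_FILES])  # which split each file covers
```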
<hr>
## 📚 Citation
If you use our paper or this dataset in your research, please cite it as follows:
```bibtex
@misc{das2024surface,
title={Under the Surface: Tracking the Artifactuality of LLM-Generated Data},
author={Debarati Das and Karin De Langis and Anna Martin and Jaehyung Kim and Minhwa Lee and Zae Myung Kim and Shirley Hayati and Risako Owan and Bin Hu and Ritik Parkar and Ryan Koo and Jonginn Park and Aahan Tyagi and Libby Ferland and Sanjali Roy and Vincent Liu and Dongyeop Kang},
year={2024},
eprint={2401.14698},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
If you have any questions or feedback, please feel free to reach out at lee03533@umn.edu.
<!-- # 🤝 Contributing -->
|
alex-miller/cdp-paf-meta-synthetic | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 2228611.265822785
num_examples: 3850
- name: test
num_bytes: 390197.5
num_examples: 495
download_size: 1017815
dataset_size: 2618808.765822785
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
hotal/emergency_classification | ---
task_categories:
- text-classification
language:
- en
pretty_name: emergency_classification
---
# Emergency Messages Classification Dataset |
Yorth/dalleTestDataFiltered2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: resolution
dtype: string
splits:
- name: train
num_bytes: 1219548166.4310646
num_examples: 4580
download_size: 1215430674
dataset_size: 1219548166.4310646
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Multimodal-Fatima/Caltech101_not_background_test_embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: vision_embeddings
sequence: float32
splits:
- name: openai_clip_vit_large_patch14
num_bytes: 95535909.0
num_examples: 5647
download_size: 98967583
dataset_size: 95535909.0
---
# Dataset Card for "Caltech101_not_background_test_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30 | ---
pretty_name: Evaluation run of JosephusCheung/Pwen-7B-Chat-20_30
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/Pwen-7B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T02:42:36.258115](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30/blob/main/results_2023-10-26T02-42-36.258115.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2954068791946309,\n\
\ \"em_stderr\": 0.004672175556184236,\n \"f1\": 0.3814209312080561,\n\
\ \"f1_stderr\": 0.004573085663083055,\n \"acc\": 0.44525521893903264,\n\
\ \"acc_stderr\": 0.012103729416391124\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2954068791946309,\n \"em_stderr\": 0.004672175556184236,\n\
\ \"f1\": 0.3814209312080561,\n \"f1_stderr\": 0.004573085663083055\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20621683093252463,\n \
\ \"acc_stderr\": 0.011144364089781436\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6842936069455406,\n \"acc_stderr\": 0.01306309474300081\n\
\ }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T02_42_36.258115
path:
- '**/details_harness|drop|3_2023-10-26T02-42-36.258115.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T02-42-36.258115.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T02_42_36.258115
path:
- '**/details_harness|gsm8k|5_2023-10-26T02-42-36.258115.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T02-42-36.258115.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-01-15.573690.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-01-15.573690.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T02_42_36.258115
path:
- '**/details_harness|winogrande|5_2023-10-26T02-42-36.258115.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T02-42-36.258115.parquet'
- config_name: results
data_files:
- split: 2023_10_10T07_01_15.573690
path:
- results_2023-10-10T07-01-15.573690.parquet
- split: 2023_10_26T02_42_36.258115
path:
- results_2023-10-26T02-42-36.258115.parquet
- split: latest
path:
- results_2023-10-26T02-42-36.258115.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/Pwen-7B-Chat-20_30
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/Pwen-7B-Chat-20_30](https://huggingface.co/JosephusCheung/Pwen-7B-Chat-20_30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30",
"harness_winogrande_5",
split="train")
```
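Note that the split names listed in the YAML configuration above write the run timestamp with underscores (e.g. `2023_10_26T02_42_36.258115`), while the corresponding parquet filenames use dashes. A minimal helper (illustrative only, not part of the official `datasets` API) converts between the two forms:

```python
def split_to_file_timestamp(split_name: str) -> str:
    """Convert a split-name timestamp to the form used in parquet filenames,
    e.g. '2023_10_26T02_42_36.258115' -> '2023-10-26T02-42-36.258115'.

    The two forms differ only in the separator character.
    """
    return split_name.replace("_", "-")
```

For example, `split_to_file_timestamp("2023_10_26T02_42_36.258115")` yields `"2023-10-26T02-42-36.258115"`, matching the filename referenced by the `harness_winogrande_5` config.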
## Latest results
These are the [latest results from run 2023-10-26T02:42:36.258115](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Pwen-7B-Chat-20_30/blob/main/results_2023-10-26T02-42-36.258115.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2954068791946309,
"em_stderr": 0.004672175556184236,
"f1": 0.3814209312080561,
"f1_stderr": 0.004573085663083055,
"acc": 0.44525521893903264,
"acc_stderr": 0.012103729416391124
},
"harness|drop|3": {
"em": 0.2954068791946309,
"em_stderr": 0.004672175556184236,
"f1": 0.3814209312080561,
"f1_stderr": 0.004573085663083055
},
"harness|gsm8k|5": {
"acc": 0.20621683093252463,
"acc_stderr": 0.011144364089781436
},
"harness|winogrande|5": {
"acc": 0.6842936069455406,
"acc_stderr": 0.01306309474300081
}
}
```
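The `"all"` entry above appears to be a simple aggregation of the per-task entries: `acc` is the unweighted mean of the tasks that report `acc`, while `em`/`f1` repeat the single `drop` entry. A quick sanity check (values copied verbatim from the snippet, interpretation of the aggregation is an assumption) confirms this:

```python
# Per-task accuracies copied verbatim from the results above.
per_task_acc = {
    "harness|gsm8k|5": 0.20621683093252463,
    "harness|winogrande|5": 0.6842936069455406,
}

# The "all" acc is the unweighted mean over tasks reporting acc.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
assert abs(mean_acc - 0.44525521893903264) < 1e-12

# The "all" em/f1 simply repeat the single drop entry.
assert 0.2954068791946309 == 0.2954068791946309  # all.em == drop.em
```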
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SamukaBrenner20/milesmoralesinsomniac | ---
license: openrail
---
|
cezar-dan-licenta-2023/RoCoLe-Variants | ---
dataset_info:
- config_name: original
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': healthy
'1': red_spider_mite
'2': rust_mild
'3': rust_medium
'4': rust_severe
splits:
- name: train
num_bytes: 12381788
num_examples: 936
- name: validation
num_bytes: 4157145
num_examples: 312
- name: test
num_bytes: 4136740
num_examples: 312
download_size: 21824000
dataset_size: 20675673
- config_name: no-background
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': healthy
'1': red_spider_mite
'2': rust_mild
'3': rust_medium
'4': rust_severe
splits:
- name: train
num_bytes: 12381788
num_examples: 936
- name: validation
num_bytes: 4157145
num_examples: 312
- name: test
num_bytes: 4136740
num_examples: 312
download_size: 21824000
dataset_size: 20675673
---
|
heliosprime/twitter_dataset_1713147558 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 4375
num_examples: 12
download_size: 8770
dataset_size: 4375
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713147558"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-eds2 | ---
pretty_name: Evaluation run of YouKnwMe/Mistral-7b-instruct-v0.2-private-eds2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YouKnwMe/Mistral-7b-instruct-v0.2-private-eds2](https://huggingface.co/YouKnwMe/Mistral-7b-instruct-v0.2-private-eds2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-eds2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T23:45:49.787204](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-eds2/blob/main/results_2024-01-25T23-45-49.787204.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.64746083935561,\n\
\ \"acc_stderr\": 0.03226131925966174,\n \"acc_norm\": 0.6469056397588263,\n\
\ \"acc_norm_stderr\": 0.032934341547214946,\n \"mc1\": 0.5887392900856793,\n\
\ \"mc1_stderr\": 0.017225627083660877,\n \"mc2\": 0.7224895419988183,\n\
\ \"mc2_stderr\": 0.014945205654147512\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.013273077865907586,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710693\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7401911969727146,\n\
\ \"acc_stderr\": 0.004376333451909804,\n \"acc_norm\": 0.8922525393347939,\n\
\ \"acc_norm_stderr\": 0.003094275186361527\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.03322015795776741,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.03322015795776741\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265026,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265026\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.01658868086453063,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.01658868086453063\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.012734923579532069,\n\
\ \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.012734923579532069\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5887392900856793,\n\
\ \"mc1_stderr\": 0.017225627083660877,\n \"mc2\": 0.7224895419988183,\n\
\ \"mc2_stderr\": 0.014945205654147512\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272962\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6550416982562547,\n \
\ \"acc_stderr\": 0.013093630133666238\n }\n}\n```"
repo_url: https://huggingface.co/YouKnwMe/Mistral-7b-instruct-v0.2-private-eds2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|arc:challenge|25_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|arc:challenge|25_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|gsm8k|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|gsm8k|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hellaswag|10_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hellaswag|10_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T23-34-45.930119.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T23-45-49.787204.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T23-45-49.787204.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- '**/details_harness|winogrande|5_2024-01-25T23-34-45.930119.parquet'
- split: 2024_01_25T23_45_49.787204
path:
- '**/details_harness|winogrande|5_2024-01-25T23-45-49.787204.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T23-45-49.787204.parquet'
- config_name: results
data_files:
- split: 2024_01_25T23_34_45.930119
path:
- results_2024-01-25T23-34-45.930119.parquet
- split: 2024_01_25T23_45_49.787204
path:
- results_2024-01-25T23-45-49.787204.parquet
- split: latest
path:
- results_2024-01-25T23-45-49.787204.parquet
---
# Dataset Card for Evaluation run of YouKnwMe/Mistral-7b-instruct-v0.2-private-eds2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YouKnwMe/Mistral-7b-instruct-v0.2-private-eds2](https://huggingface.co/YouKnwMe/Mistral-7b-instruct-v0.2-private-eds2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-eds2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-25T23:45:49.787204](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnwMe__Mistral-7b-instruct-v0.2-private-eds2/blob/main/results_2024-01-25T23-45-49.787204.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.64746083935561,
"acc_stderr": 0.03226131925966174,
"acc_norm": 0.6469056397588263,
"acc_norm_stderr": 0.032934341547214946,
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660877,
"mc2": 0.7224895419988183,
"mc2_stderr": 0.014945205654147512
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.013273077865907586,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710693
},
"harness|hellaswag|10": {
"acc": 0.7401911969727146,
"acc_stderr": 0.004376333451909804,
"acc_norm": 0.8922525393347939,
"acc_norm_stderr": 0.003094275186361527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265026,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265026
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.01658868086453063,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.01658868086453063
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532069,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660877,
"mc2": 0.7224895419988183,
"mc2_stderr": 0.014945205654147512
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272962
},
"harness|gsm8k|5": {
"acc": 0.6550416982562547,
"acc_stderr": 0.013093630133666238
}
}
```
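The per-task scores in a payload like the one above can also be aggregated locally after parsing the JSON. The sketch below uses a small hypothetical subset of the full results dict (the real file holds one entry per task plus the precomputed `"all"` aggregate):

```python
import json

# Hypothetical subset of a results payload; real files contain one
# entry per task in the same {"acc": ..., "acc_norm": ...} shape.
results_json = """
{
    "harness|arc:challenge|25": {"acc_norm": 0.7312286689419796},
    "harness|hellaswag|10": {"acc_norm": 0.8922525393347939},
    "harness|winogrande|5": {"acc": 0.8468823993685872}
}
"""
results = json.loads(results_json)

# Macro-average: prefer acc_norm where reported, fall back to acc.
scores = [m.get("acc_norm", m.get("acc")) for m in results.values()]
macro_avg = sum(scores) / len(scores)
print(round(macro_avg, 4))  # prints 0.8235
```

The same pattern extends to the full payload once it has been downloaded from the repo's `results_*.json` files.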
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
LLM-Editor/animals | ---
dataset_info:
features:
- name: Animal
dtype: string
- name: Taxonomic Rank
dtype: string
- name: Definition
dtype: string
splits:
- name: train
num_bytes: 52655
num_examples: 300
download_size: 7036
dataset_size: 52655
---
# Dataset Card for "animals"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Toflamus/alpaca_data_split | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 17099808.501826853
num_examples: 46801
- name: test
num_bytes: 1900303.4981731472
num_examples: 5201
download_size: 12068449
dataset_size: 19000112.0
---
# Dataset Card for "alpaca_data_split"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
betogaunt2/vozbeto | ---
license: openrail
---
|
irds/lotte_recreation_test | ---
pretty_name: '`lotte/recreation/test`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/recreation/test`
The `lotte/recreation/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/recreation/test).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=166,975
This dataset is used by: [`lotte_recreation_test_forum`](https://huggingface.co/datasets/irds/lotte_recreation_test_forum), [`lotte_recreation_test_search`](https://huggingface.co/datasets/irds/lotte_recreation_test_search)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/lotte_recreation_test', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
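Each record yielded by the `docs` split is a plain dict with `doc_id` and `text` keys, so ordinary Python filtering applies. A minimal sketch, using toy records in place of the downloaded corpus (the real records follow the same schema):

```python
# Toy stand-ins for records from the 'docs' split; real records
# use the same {'doc_id': ..., 'text': ...} schema.
docs = [
    {"doc_id": "rec-1", "text": "Tips for restringing an acoustic guitar."},
    {"doc_id": "rec-2", "text": "Board game recommendations for two players."},
    {"doc_id": "rec-3", "text": "How to wax a snowboard before the season."},
]

def keyword_filter(records, keyword):
    """Return records whose text mentions the keyword (case-insensitive)."""
    needle = keyword.lower()
    return [r for r in records if needle in r["text"].lower()]

matches = keyword_filter(docs, "guitar")
print([r["doc_id"] for r in matches])  # prints ['rec-1']
```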
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
Tobius/teric_asr_lab | ---
license: cc0-1.0
---
|
yijingwu/HeySQuAD_human | ---
license: cc-by-4.0
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: answers
list:
- name: answer_start
dtype: int64
- name: text
dtype: string
- name: is_impossible
dtype: bool
- name: id
dtype: string
- name: plausible_answers
list:
- name: answer_start
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 15547759456.9
num_examples: 71990
- name: validation
num_bytes: 867205293.036
num_examples: 4158
download_size: 14616752976
dataset_size: 16414964749.935999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
## Citation

```
@misc{wu2023heysquad,
  title = {HeySQuAD: A Spoken Question Answering Dataset},
  author = {Yijing Wu and SaiKrishna Rallabandi and Ravisutha Srinivasamurthy and Parag Pravin Dakle and Alolika Gon and Preethi Raghavan},
  year = {2023},
  eprint = {2304.13689},
  archivePrefix = {arXiv},
  primaryClass = {cs.CL}
}
```
 |
biswa921/hiwiki-latest-pages-articles-raw | ---
license: apache-2.0
dataset_info:
features:
- name: title
dtype: string
- name: url
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 742313430
num_examples: 231836
download_size: 268567208
dataset_size: 742313430
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jys09/ntu-adl-hw1 | ---
license: mit
task_categories:
- question-answering
--- |
mozci/logobookDB | ---
language:
- en
license: afl-3.0
size_categories:
- 1K<n<10K
task_categories:
- text-to-image
pretty_name: Logobook Archive with Captions
tags:
- brand
- logo
- design
- graphic design
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 162614866.176
num_examples: 4026
download_size: 139569721
dataset_size: 162614866.176
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card
This dataset contains image-caption pairs for logo designs scraped from logobook.com. It was created for my research project on fine-tuning text-to-image diffusion models with logo designs.
Logobook.com hosts a very nice logo archive of modernist and simplistic logo designs. Each design is stored along with some keywords, which I used to create a caption for each logo.
See example below:

Caption:
Adams Law, a prominent law firm in Ireland, features a sleek and professional logo design by Jeremy Simmons of Process. The logo showcases a symbolic letter 'A' enclosed within a circular frame, representing unity and integrity. The inclusion of the word 'Ireland' emphasizes the firm's local expertise and dedication to serving the Irish community. A subtle quotation mark adds a touch of elegance and sophistication, reflecting Adams Law's commitment to delivering impactful legal solutions. This timeless logo design, created in 2017, effectively captures the firm's professionalism and legal expertise.
## Copyright disclaimer
Created and used for research purposes. |
open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B | ---
pretty_name: Evaluation run of marcchew/LaMini-40k-Platypus2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [marcchew/LaMini-40k-Platypus2-7B](https://huggingface.co/marcchew/LaMini-40k-Platypus2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T19:17:16.064054](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B/blob/main/results_2023-12-03T19-17-16.064054.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/marcchew/LaMini-40k-Platypus2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|arc:challenge|25_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T08_58_28.638753
path:
- '**/details_harness|drop|3_2023-10-28T08-58-28.638753.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T08-58-28.638753.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T08_58_28.638753
path:
- '**/details_harness|gsm8k|5_2023-10-28T08-58-28.638753.parquet'
- split: 2023_12_03T19_17_16.064054
path:
- '**/details_harness|gsm8k|5_2023-12-03T19-17-16.064054.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T19-17-16.064054.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hellaswag|10_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T12-51-11.107895.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T12-51-11.107895.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T12-51-11.107895.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T08_58_28.638753
path:
- '**/details_harness|winogrande|5_2023-10-28T08-58-28.638753.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T08-58-28.638753.parquet'
- config_name: results
data_files:
- split: 2023_09_18T12_51_11.107895
path:
- results_2023-09-18T12-51-11.107895.parquet
- split: 2023_10_28T08_58_28.638753
path:
- results_2023-10-28T08-58-28.638753.parquet
- split: 2023_12_03T19_17_16.064054
path:
- results_2023-12-03T19-17-16.064054.parquet
- split: latest
path:
- results_2023-12-03T19-17-16.064054.parquet
---
# Dataset Card for Evaluation run of marcchew/LaMini-40k-Platypus2-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/marcchew/LaMini-40k-Platypus2-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [marcchew/LaMini-40k-Platypus2-7B](https://huggingface.co/marcchew/LaMini-40k-Platypus2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B",
"harness_gsm8k_5",
split="train")
```
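Each timestamped split name is derived from the run timestamp by replacing the characters that do not appear in split names. As a minimal sketch (the `split_name_from_timestamp` helper is hypothetical; the sanitization rule is an assumption inferred from the split names listed in the configs above, e.g. `2023-12-03T19:17:16.064054` becoming `2023_12_03T19_17_16.064054`):

```python
# Hypothetical helper: derive the split name used in this dataset's configs
# from a run's ISO timestamp. Assumption: '-' and ':' both map to '_',
# while 'T' and '.' are kept as-is, matching the split names above.
def split_name_from_timestamp(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2023-12-03T19:17:16.064054"))
# -> 2023_12_03T19_17_16.064054
```

The resulting string can then be passed as `split=` to `load_dataset` instead of `"latest"` to read one specific run.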
## Latest results
These are the [latest results from run 2023-12-03T19:17:16.064054](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__LaMini-40k-Platypus2-7B/blob/main/results_2023-12-03T19-17-16.064054.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
one-sec-cv12/chunk_3 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 18734066352.875
num_examples: 195049
download_size: 15931724724
dataset_size: 18734066352.875
---
# Dataset Card for "chunk_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rasgaard/mmi-bendr-preprocessed | ---
dataset_info:
features:
- name: data
sequence:
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 531730928
num_examples: 4324
- name: val
num_bytes: 5164824
num_examples: 42
- name: test
num_bytes: 5164824
num_examples: 42
download_size: 207795149
dataset_size: 542060576
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
license: odc-by
---
The [EEG Motor Movement/Imagery (MMI) Dataset](https://physionet.org/content/eegmmidb/1.0.0/) preprocessed with [DN3](https://github.com/SPOClab-ca/dn3/) to be used for downstream fine-tuning with [BENDR](https://github.com/SPOClab-ca/BENDR).
The labels correspond to Task 4 (imagine opening and closing both fists or both feet) from experimental runs 4, 10 and 14.
## Creating dataloaders
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
dataset = load_dataset("rasgaard/mmi-bendr-preprocessed")
dataset.set_format("torch")
train_loader = DataLoader(dataset["train"], batch_size=8)
val_loader = DataLoader(dataset["val"], batch_size=8)
test_loader = DataLoader(dataset["test"], batch_size=8)
batch = next(iter(train_loader))
batch["data"].shape, batch["label"].shape
>>> (torch.Size([8, 20, 1536]), torch.Size([8]))
``` |
ksuyash/food-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': burger
'1': butter_naan
'2': chai
'3': chapati
'4': chole_bhature
'5': dal_makhani
'6': dhokla
'7': fried_rice
'8': idli
'9': jalebi
'10': kaathi_rolls
'11': kadai_paneer
'12': kulfi
'13': masala_dosa
'14': momos
'15': paani_puri
'16': pakode
'17': pav_bhaji
'18': pizza
'19': samosa
splits:
- name: train
num_bytes: 1716156838.154
num_examples: 6269
download_size: 1579640750
dataset_size: 1716156838.154
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
udmurtNLP/zerpal-udmdunne | ---
language:
- udm
size_categories:
- 10K<n<100K
task_categories:
- text-classification
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 47031643
num_examples: 10258
- name: valid
num_bytes: 5299403
num_examples: 1140
download_size: 25632119
dataset_size: 52331046
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
|
zhiqings/LLaVA-Human-Preference-10K | ---
license: apache-2.0
---
|
Dauren-Nur/ISSAI_SKIMMED | ---
dataset_info:
features:
- name: uttID
dtype: string
- name: deviceID
dtype: int64
- name: text
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 1921144536.0
num_examples: 15000
- name: test
num_bytes: 407749055.0
num_examples: 3334
- name: dev
num_bytes: 384006753.0
num_examples: 3283
download_size: 2712080390
dataset_size: 2712900344.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
---
|
ihaflix1/herbert | ---
license: openrail
---
|
reaganjlee/boolq_es | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: question
dtype: string
- name: passage
dtype: string
- name: answer
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 4397871
num_examples: 9427
- name: validation
num_bytes: 1520093
num_examples: 3270
download_size: 3613558
dataset_size: 5917964
---
# Dataset Card for "boolq_es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
docdewarper/dewarpnet_weights | ---
license: mit
---
Weights for the pre-trained DewarpNet model developed by Das et al. (2019).
The code for the model is available at https://github.com/cvlab-stonybrook/DewarpNet. |
scribis/italian-literature-corpus-mini | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 125104611.3227607
num_examples: 872594
- name: validation
num_bytes: 13900528.299298717
num_examples: 96955
download_size: 143032707
dataset_size: 139005139.6220594
language:
- it
---
# Dataset Card for "italian-literature-corpus-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TheFinAI/en-forecasting-bigdata | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 18720287
num_examples: 4897
- name: valid
num_bytes: 1278834
num_examples: 798
- name: test
num_bytes: 2379111
num_examples: 1472
download_size: 11003337
dataset_size: 22378232
---
# Dataset Card for "flare-sm-bigdata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KnutJaegersberg/wikipedia_categories | ---
license: mit
---
|
McSpicyWithMilo/target-elements-0.2split-new-delete-180-validation | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: target_element
dtype: string
splits:
- name: train
num_bytes: 12696.8
num_examples: 144
- name: test
num_bytes: 1587.1
num_examples: 18
- name: valid
num_bytes: 1587.1
num_examples: 18
download_size: 13176
dataset_size: 15871.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
# Dataset Card for "target-elements-0.2split-new-delete-180-validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TaylorAI/RLCD-generated-preference-data-split | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: float64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 142629947
num_examples: 160000
- name: validation
num_bytes: 7163731
num_examples: 7999
download_size: 88067760
dataset_size: 149793678
---
# Dataset Card for "RLCD-generated-preference-data-split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KShivendu/dbpedia-entities-openai-1M | ---
license: mit
dataset_info:
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: openai
sequence: float32
splits:
- name: train
num_bytes: 12383152
num_examples: 1000000
download_size: 12383152
dataset_size: 1000000
language:
- en
task_categories:
- feature-extraction
pretty_name: OpenAI 1M with DBPedia Entities
size_categories:
- 1M<n<10M
---
1M OpenAI embeddings, 1536 dimensions each.
- Created: June 2023
- Text used for embedding: title (string) + text (string)
- Embedding model: text-embedding-ada-002
First used for the pgvector vs VectorDB (Qdrant) benchmark: https://nirantk.com/writing/pgvector-vs-qdrant/
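Since each row carries a 1536-dimensional `text-embedding-ada-002` vector in the `openai` field, a typical use is nearest-neighbor search by cosine similarity. Below is a minimal sketch of that computation; it uses random vectors with the dataset's dimensionality as stand-ins, so it runs without downloading the 1M rows:

```python
import numpy as np

def top_k(query: np.ndarray, matrix: np.ndarray, k: int = 3) -> np.ndarray:
    """Indices of the k rows of `matrix` most cosine-similar to `query`."""
    q = query / np.linalg.norm(query)
    m = matrix / np.linalg.norm(matrix, axis=1, keepdims=True)
    return np.argsort(-(m @ q))[:k]

# Stand-in vectors with the dataset's dimensionality (1536); real usage
# would collect the 'openai' field of each row into `emb` instead.
rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 1536)).astype(np.float32)
nearest = top_k(emb[0], emb)
assert nearest[0] == 0  # a vector is maximally similar to itself
```

With the real data, `emb` would be built from `row["openai"]` across rows (e.g. via `load_dataset(..., streaming=True)`), and `query` would be an ada-002 embedding of the search text.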
### Future work
We are planning to take this up to 10M (and possibly 100M) vectors. Contact [@KShivendu_](https://twitter.com/KShivendu_) on Twitter or mail to hello@nirantk.com if you want to help :)
### Credits:
This dataset was generated from the first 1M entries of https://huggingface.co/datasets/BeIR/dbpedia-entity |
sophie-shetty08/alzheimer-master-dataset | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_qqp_null_referential_pronouns | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 529226
num_examples: 2610
- name: test
num_bytes: 5308180
num_examples: 26108
- name: train
num_bytes: 5043940
num_examples: 24291
download_size: 6908933
dataset_size: 10881346
---
# Dataset Card for "MULTI_VALUE_qqp_null_referential_pronouns"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Biomedical-TeMU/ProfNER_corpus_classification | ---
license: cc-by-4.0
---
|
CyberHarem/manannan_mac_lir_bazett_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of manannan_mac_lir_bazett/マナナン・マク・リール〔バゼット〕/马纳南·麦克·利尔〔巴泽特〕 (Fate/Grand Order)
This is the dataset of manannan_mac_lir_bazett/マナナン・マク・リール〔バゼット〕/马纳南·麦克·利尔〔巴泽特〕 (Fate/Grand Order), containing 78 images and their tags.
The core tags of this character are `mole, mole_under_eye, breasts, parted_bangs, purple_hair, red_hair, red_eyes, long_hair, purple_eyes, large_breasts, short_hair`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 78 | 104.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manannan_mac_lir_bazett_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 78 | 91.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manannan_mac_lir_bazett_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 176 | 175.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manannan_mac_lir_bazett_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/manannan_mac_lir_bazett_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_gloves, black_necktie, black_pants, collared_shirt, dress_shirt, formal, grey_coat, long_sleeves, adjusting_gloves, open_coat, solo, black_suit, looking_at_viewer, simple_background, white_background, jacket |
| 1 | 24 |  |  |  |  |  | 1girl, black_bodysuit, looking_at_viewer, low_ponytail, blush, solo, long_sleeves, necktie, jewelry, black_gloves, blue_bodysuit |
| 2 | 12 |  |  |  |  |  | 1girl, bare_shoulders, smile, white_dress, elbow_gloves, solo, white_gloves, cleavage, looking_at_viewer, blush |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | black_necktie | black_pants | collared_shirt | dress_shirt | formal | grey_coat | long_sleeves | adjusting_gloves | open_coat | solo | black_suit | looking_at_viewer | simple_background | white_background | jacket | black_bodysuit | low_ponytail | blush | necktie | jewelry | blue_bodysuit | bare_shoulders | smile | white_dress | elbow_gloves | white_gloves | cleavage |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:----------------|:--------------|:-----------------|:--------------|:---------|:------------|:---------------|:-------------------|:------------|:-------|:-------------|:--------------------|:--------------------|:-------------------|:---------|:-----------------|:---------------|:--------|:----------|:----------|:----------------|:-----------------|:--------|:--------------|:---------------|:---------------|:-----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 24 |  |  |  |  |  | X | X | | | | | | | X | | | X | | X | | | | X | X | X | X | X | X | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | | | | | | | | | | X | | X | | | | | | X | | | | X | X | X | X | X | X |
|
tharun-6743/tharun | ---
license: openrail
---
|
Devartbio/landing | ---
license: mit
---
|
georgereyna/omni | ---
license: cc
---
|
MicPie/unpredictable_cappex-com | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cappex.com
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cappex.com" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. Our dataset is very wide, i.e., it has thousands of tasks with only a few examples each, in contrast to most current NLP datasets, which are deep, i.e., tens of tasks with many examples each. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning or pre-training on it.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a jsonlines file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by 'input', 'options', and 'output' fields. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target that represents an individual column of the same row. Each task contains several such examples, which can be concatenated as a few-shot task. In the case of multiple-choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
### Data Fields
- 'task': task identifier
- 'input': column elements of a specific row in the table
- 'options': for multiple-choice classification, the options to choose from
- 'output': target column element of the same row as the input
- 'pageTitle': the title of the page containing the table
- 'outputColName': the name of the output column
- 'url': the URL of the website containing the table
- 'wdcFile': the WDC Web Table Corpus file
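Concretely, a single few-shot example carrying these fields might look like the following. The field names come from this card; every value below is invented purely for illustration:

```python
# A hypothetical example record; the field names match the dataset,
# while all values here are invented for illustration only.
example = {
    "task": "colleges_type",
    "input": "[Name] Acme College [State] Ohio",
    "options": ["Public", "Private"],
    "output": "Private",
    "pageTitle": "List of colleges",
    "outputColName": "Type",
    "url": "https://example.com/colleges",
    "wdcFile": "example-wdc-file.json.gz",
}

# For multiple-choice tasks, the target is always one of the options.
assert example["output"] in example["options"]
```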
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
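For intuition only, here is a drastically simplified sketch of the tables-to-tasks idea (the real pipeline, described in the publication, involves substantially more filtering and formatting): each table row becomes one example, with one held-out column serving as the target.

```python
def table_to_task(header, rows, output_col):
    """Turn one table into few-shot examples, holding out `output_col` as the target."""
    out_idx = header.index(output_col)
    examples = []
    for row in rows:
        # Concatenate the remaining columns as "[ColumnName] value" pairs.
        inp = " ".join(f"[{h}] {v}" for h, v in zip(header, row) if h != output_col)
        examples.append({"input": inp, "output": row[out_idx]})
    return examples

# A toy table (invented for illustration).
header = ["Name", "State", "Type"]
rows = [["Acme College", "Ohio", "Private"], ["State U", "Iowa", "Public"]]
task = table_to_task(header, rows, "Type")
assert task[0] == {"input": "[Name] Acme College [State] Ohio", "output": "Private"}
```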
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied | ---
pretty_name: Evaluation run of hiyouga/Baichuan2-7B-Base-LLaMAfied
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hiyouga/Baichuan2-7B-Base-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T06:44:24.493952](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied/blob/main/results_2023-10-26T06-44-24.493952.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.0004058451132417743,\n \"f1\": 0.0585476090604028,\n\
\ \"f1_stderr\": 0.0013740361163735455,\n \"acc\": 0.3926358910777041,\n\
\ \"acc_stderr\": 0.010089987799825416\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417743,\n\
\ \"f1\": 0.0585476090604028,\n \"f1_stderr\": 0.0013740361163735455\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \
\ \"acc_stderr\": 0.007390654481108214\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7071823204419889,\n \"acc_stderr\": 0.01278932111854262\n\
\ }\n}\n```"
repo_url: https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T06_44_24.493952
path:
- '**/details_harness|drop|3_2023-10-26T06-44-24.493952.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T06-44-24.493952.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T06_44_24.493952
path:
- '**/details_harness|gsm8k|5_2023-10-26T06-44-24.493952.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T06-44-24.493952.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-25-43.126145.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T07-25-43.126145.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T06_44_24.493952
path:
- '**/details_harness|winogrande|5_2023-10-26T06-44-24.493952.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T06-44-24.493952.parquet'
- config_name: results
data_files:
- split: 2023_10_10T07_25_43.126145
path:
- results_2023-10-10T07-25-43.126145.parquet
- split: 2023_10_26T06_44_24.493952
path:
- results_2023-10-26T06-44-24.493952.parquet
- split: latest
path:
- results_2023-10-26T06-44-24.493952.parquet
---
# Dataset Card for Evaluation run of hiyouga/Baichuan2-7B-Base-LLaMAfied
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [hiyouga/Baichuan2-7B-Base-LLaMAfied](https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied",
"harness_winogrande_5",
split="train")
```
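The timestamped split names used in the configs above are derived mechanically from each run's timestamp: `-` and `:` are replaced with `_`, while the `T` separator and fractional seconds are kept. A minimal sketch of that mapping (the helper name is illustrative, not part of any library):

```python
# Convert a run timestamp (as it appears in result file names) into the
# split name used in this dataset's configs: '-' and ':' become '_'.
def timestamp_to_split_name(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-10-26T06:44:24.493952"))
# -> 2023_10_26T06_44_24.493952
```

This is useful when you want to load a specific historical run rather than the `latest` split, e.g. `split="2023_10_26T06_44_24.493952"` in `load_dataset`.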
## Latest results
These are the [latest results from run 2023-10-26T06:44:24.493952](https://huggingface.co/datasets/open-llm-leaderboard/details_hiyouga__Baichuan2-7B-Base-LLaMAfied/blob/main/results_2023-10-26T06-44-24.493952.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417743,
"f1": 0.0585476090604028,
"f1_stderr": 0.0013740361163735455,
"acc": 0.3926358910777041,
"acc_stderr": 0.010089987799825416
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417743,
"f1": 0.0585476090604028,
"f1_stderr": 0.0013740361163735455
},
"harness|gsm8k|5": {
"acc": 0.07808946171341925,
"acc_stderr": 0.007390654481108214
},
"harness|winogrande|5": {
"acc": 0.7071823204419889,
"acc_stderr": 0.01278932111854262
}
}
```
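The aggregated JSON above is plain nested data; as a small illustration (the dict literal simply mirrors the results shown above), the per-task accuracies can be pulled out like this:

```python
# The "Latest results" JSON above, reproduced as a Python dict.
results = {
    "all": {
        "em": 0.001572986577181208,
        "f1": 0.0585476090604028,
        "acc": 0.3926358910777041,
    },
    "harness|drop|3": {"em": 0.001572986577181208, "f1": 0.0585476090604028},
    "harness|gsm8k|5": {"acc": 0.07808946171341925},
    "harness|winogrande|5": {"acc": 0.7071823204419889},
}

# Per-task accuracies, skipping the "all" aggregate and tasks that
# report em/f1 instead of acc.
accs = {task: vals["acc"] for task, vals in results.items()
        if task != "all" and "acc" in vals}
```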
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
eswanYS/customhkcode2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
minhnguyen/IEDB_B_cell | ---
license: apache-2.0
---
|
apollo-research/sae-skeskinen-TinyStories-hf-tokenizer-gpt2_play | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1911420483
num_examples: 2119719
- name: validation
num_bytes: 19306310
num_examples: 21990
download_size: 1001364597
dataset_size: 1930726793
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
CATIE-AQ/orange_sum_fr_prompt_fill_mask | ---
language:
- fr
license: cc-by-sa-4.0
size_categories:
- 100K<n<1M
task_categories:
- fill-mask
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- orange_sum
---
# orange_sum_fr_prompt_fill_mask
## Summary
**orange_sum_fr_prompt_fill_mask** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **585,624** rows that can be used for a fill-mask task.
The original data (without prompts) comes from the dataset [orange_sum](https://huggingface.co/datasets/orange_sum) by Eddine et al.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
24 prompts were created for this dataset. The prompts are phrased in the indicative tense and in both the informal (tutoiement) and formal (vouvoiement) forms of address.
```
'Remplacer le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Remplace le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Remplacez le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Remplacer le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Remplace le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Remplacez le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Substituer le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Substitue le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Substituez le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Substituer le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Substitue le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Substituez le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Changer le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Change le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Changez le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Changer le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Change le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Changez le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Transformer le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Transforme le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Transformez le <mask> dans le texte suivant par le mot le plus vraisemblable : '+text,
'Transformer le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Transforme le <mask> dans le texte suivant par le mot le plus probable : '+text,
'Transformez le <mask> dans le texte suivant par le mot le plus probable : '+text,
```
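As an illustration only, combining one of the templates above with a masked text might look like the following sketch. The actual DFP masking procedure is not documented here, so the one-random-word masking strategy and the helper below are assumptions, not the real pipeline:

```python
import random

# Illustrative subset of the 24 templates listed above.
PROMPTS = [
    "Remplacer le <mask> dans le texte suivant par le mot le plus vraisemblable : ",
    "Remplace le <mask> dans le texte suivant par le mot le plus vraisemblable : ",
    "Remplacez le <mask> dans le texte suivant par le mot le plus vraisemblable : ",
]

def build_example(text, seed=0):
    """Mask one word of `text` and prepend a randomly chosen prompt.

    Returns an (inputs, targets) pair in the xP3 style. Masking a single
    random word is an assumption made for this illustration.
    """
    rng = random.Random(seed)
    words = text.split()
    idx = rng.randrange(len(words))
    target = words[idx]
    words[idx] = "<mask>"
    prompt = rng.choice(PROMPTS)
    return prompt + " ".join(words), target

inputs, target = build_example("Le chat dort sur le canapé", seed=42)
```

Note that `inputs` contains `<mask>` twice: once inside the instruction itself and once in the masked text that follows the colon.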
# Splits
- `train` with 513,624 samples
- `valid` with 36,000 samples
- `test` with 36,000 samples
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/orange_sum_fr_prompt_fill_mask")
```
# Citation
## Original data
```
@article{eddine2020barthez,
  title={BARThez: a Skilled Pretrained French Sequence-to-Sequence Model},
  author={Eddine, Moussa Kamal and Tixier, Antoine J-P and Vazirgiannis, Michalis},
  journal={arXiv preprint arXiv:2010.12321},
  year={2020}
}
```
## This Dataset
```
@misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
  author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
  title = { DFP (Revision 1d24c09) },
  year = 2023,
  url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
  doi = { 10.57967/hf/1200 },
  publisher = { Hugging Face }
}
```
## License
CC-BY-SA-4.0
|
benticha/coachllm-lama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 238155
num_examples: 996
download_size: 75613
dataset_size: 238155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Intuit-GenSRF/tweets-hate-speech-detection | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: string
splits:
- name: train
num_bytes: 3081848
num_examples: 31962
download_size: 2067751
dataset_size: 3081848
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tweets_hate_speech_detection"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
james-burton/airbnb_summaries | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: float32
- name: scaled_label
dtype: float64
splits:
- name: train
num_bytes: 12251861
num_examples: 14718
- name: validation
num_bytes: 2605549
num_examples: 3155
- name: test
num_bytes: 2602259
num_examples: 3155
download_size: 10147370
dataset_size: 17459669
---
# Dataset Card for "airbnb_summaries"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
123tozi123/toki-pona-tts | ---
license: cc-by-sa-4.0
task_categories:
- text-to-speech
size_categories:
- n<1K
--- |
mabenan/abap | ---
license: mit
---
|
Fucheng/astro_lens | ---
license: openrail
---
|
heliosprime/twitter_dataset_1712972981 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 10130
num_examples: 23
download_size: 9180
dataset_size: 10130
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712972981"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Back-up/chung-khoan-demo-p4 | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 47642171
num_examples: 8923
download_size: 17034905
dataset_size: 47642171
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tdh87/StoryDATA6.0 | ---
license: apache-2.0
---
|
ryL/Taiwan-mandarin | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 332633731.0
num_examples: 846
- name: test
num_bytes: 82886161.0
num_examples: 228
- name: validation
num_bytes: 98951893.0
num_examples: 230
download_size: 513794624
dataset_size: 514471785.0
---
# Dataset Card for "Taiwan-mandarin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xwjiang2010/pile_dedupe_val_tokenized_chunked | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 6937220668
num_examples: 423311
download_size: 3163118066
dataset_size: 6937220668
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
abhiram973/MedicalQA | ---
license: mit
---
|
uclgroup8/early-exit-iemocap-embeddings-v3 | ---
dataset_info:
features:
- name: emotion
dtype: string
- name: to_translate
dtype: string
- name: early_audio_embeddings
sequence:
sequence: float64
- name: audio_embeddings
sequence:
sequence: float64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
dtype: int64
- name: text_embeddings
sequence: float32
splits:
- name: train
num_bytes: 88597718
num_examples: 5501
- name: test
num_bytes: 11079348
num_examples: 688
- name: val
num_bytes: 11080896
num_examples: 688
download_size: 97134937
dataset_size: 110757962
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
Ransaka/IAM | ---
dataset_info:
features:
- name: image
dtype: image
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 100524579.707
num_examples: 2913
download_size: 95676516
dataset_size: 100524579.707
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- image-to-text
language:
- en
pretty_name: IAM Handwriting Dataset
size_categories:
- 1K<n<10K
--- |
MingweiMao/Side-view-Pigs | ---
license: other
---
# Side-view Pigs

This dataset consists of pig-farming images captured from a side-view perspective.

After downloading the dataset, place the images and labels in the `JPEGImages` and `Annotations` folders under `VOCdevkit/VOC2007`. Running `VOC.py` splits the data into training, validation, and test sets in VOC format according to the specified ratios; running `voc-yolo.py` produces the same splits in YOLO format.

Following these steps yields both the VOC and YOLO formats of this side-view-pigs dataset.
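A minimal sketch of the ratio-based splitting that `VOC.py` and `voc-yolo.py` presumably perform is shown below; the actual scripts in this repository may use different ratios, ordering, or file layout:

```python
import random

def split_dataset(filenames, ratios=(0.8, 0.1, 0.1), seed=0):
    """Shuffle filenames and split them into train/val/test by the given ratios.

    This mirrors the general idea of the repo's split scripts; it is an
    illustrative assumption, not the code shipped with the dataset.
    """
    assert abs(sum(ratios) - 1.0) < 1e-9
    files = sorted(filenames)
    random.Random(seed).shuffle(files)  # deterministic shuffle for reproducibility
    n = len(files)
    n_train = int(n * ratios[0])
    n_val = int(n * ratios[1])
    return files[:n_train], files[n_train:n_train + n_val], files[n_train + n_val:]

# Example with hypothetical filenames.
train, val, test = split_dataset([f"pig_{i:04d}.jpg" for i in range(100)])
```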
|