| datasetId | card |
|---|---|
emrecan/nli_tr_for_simcse | ---
language:
- tr
size_categories:
- 100K<n<1M
source_datasets:
- nli_tr
task_categories:
- text-classification
task_ids:
- semantic-similarity-scoring
- text-scoring
---
# NLI-TR for Supervised SimCSE
This dataset is a modified version of the [NLI-TR](https://huggingface.co/datasets/nli_tr) dataset. It is intended for training supervised [SimCSE](https://github.com/princeton-nlp/SimCSE) models for sentence embeddings. The steps followed to produce this dataset are listed below:
1. Merge the train splits of the snli_tr and multinli_tr subsets.
2. Find every premise that has an entailment hypothesis **and** a contradiction hypothesis.
3. Write the resulting triplets in the sent0 (premise), sent1 (entailment hypothesis), hard_neg (contradiction hypothesis) format.
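The steps above can be sketched in plain Python (a minimal illustration over toy rows; the actual dataset was built from the full snli_tr and multinli_tr train splits, and the function name here is hypothetical):

```python
from collections import defaultdict

def build_simcse_triplets(rows):
    """Group NLI rows by premise and emit SimCSE-style triplets.

    Each input row is (premise, hypothesis, label), with label in
    {"entailment", "contradiction", "neutral"}. Only premises that
    have both an entailment and a contradiction hypothesis yield a
    triplet; neutral pairs are ignored.
    """
    by_premise = defaultdict(lambda: {"entailment": [], "contradiction": []})
    for premise, hypothesis, label in rows:
        if label in ("entailment", "contradiction"):
            by_premise[premise][label].append(hypothesis)

    triplets = []
    for premise, hyps in by_premise.items():
        # zip pairs entailments with contradictions; premises missing
        # either kind contribute nothing.
        for ent, con in zip(hyps["entailment"], hyps["contradiction"]):
            triplets.append({"sent0": premise, "sent1": ent, "hard_neg": con})
    return triplets

rows = [
    ("Bir adam koşuyor.", "Bir kişi hareket ediyor.", "entailment"),
    ("Bir adam koşuyor.", "Adam uyuyor.", "contradiction"),
    ("Kedi uyuyor.", "Kedi dinleniyor.", "entailment"),  # no contradiction -> dropped
]
print(build_simcse_triplets(rows))
```
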
See this [Colab Notebook](https://colab.research.google.com/drive/1Ysq1SpFOa7n1X79x2HxyWjfKzuR_gDQV?usp=sharing) for training and evaluation on Turkish sentences. |
open-llm-leaderboard/details_mindy-labs__mindy-7b-v2 | ---
pretty_name: Evaluation run of mindy-labs/mindy-7b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mindy-labs/mindy-7b-v2](https://huggingface.co/mindy-labs/mindy-7b-v2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mindy-labs__mindy-7b-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-21T18:22:51.264759](https://huggingface.co/datasets/open-llm-leaderboard/details_mindy-labs__mindy-7b-v2/blob/main/results_2023-12-21T18-22-51.264759.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6558321041397203,\n\
\ \"acc_stderr\": 0.03207006697624872,\n \"acc_norm\": 0.6560363290954173,\n\
\ \"acc_norm_stderr\": 0.0327312814050994,\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6016405207483612,\n\
\ \"mc2_stderr\": 0.015192119540299543\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6535836177474402,\n \"acc_stderr\": 0.013905011180063235,\n\
\ \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623492\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.678550089623581,\n\
\ \"acc_stderr\": 0.004660785616933756,\n \"acc_norm\": 0.8658633738299144,\n\
\ \"acc_norm_stderr\": 0.0034010255178737263\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n\
\ \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n\
\ \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n\
\ \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n\
\ \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n \
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"\
acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\"\
: 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630872,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630872\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n\
\ \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n\
\ \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944863,\n\
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944863\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528183,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528183\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44063647490820074,\n\
\ \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6016405207483612,\n\
\ \"mc2_stderr\": 0.015192119540299543\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989247\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.709628506444276,\n \
\ \"acc_stderr\": 0.012503592481818957\n }\n}\n```"
repo_url: https://huggingface.co/mindy-labs/mindy-7b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|arc:challenge|25_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|gsm8k|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hellaswag|10_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-21T18-22-51.264759.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-21T18-22-51.264759.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- '**/details_harness|winogrande|5_2023-12-21T18-22-51.264759.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-21T18-22-51.264759.parquet'
- config_name: results
data_files:
- split: 2023_12_21T18_22_51.264759
path:
- results_2023-12-21T18-22-51.264759.parquet
- split: latest
path:
- results_2023-12-21T18-22-51.264759.parquet
---
# Dataset Card for Evaluation run of mindy-labs/mindy-7b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mindy-labs/mindy-7b-v2](https://huggingface.co/mindy-labs/mindy-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mindy-labs__mindy-7b-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-21T18:22:51.264759](https://huggingface.co/datasets/open-llm-leaderboard/details_mindy-labs__mindy-7b-v2/blob/main/results_2023-12-21T18-22-51.264759.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6558321041397203,
"acc_stderr": 0.03207006697624872,
"acc_norm": 0.6560363290954173,
"acc_norm_stderr": 0.0327312814050994,
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6016405207483612,
"mc2_stderr": 0.015192119540299543
},
"harness|arc:challenge|25": {
"acc": 0.6535836177474402,
"acc_stderr": 0.013905011180063235,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623492
},
"harness|hellaswag|10": {
"acc": 0.678550089623581,
"acc_stderr": 0.004660785616933756,
"acc_norm": 0.8658633738299144,
"acc_norm_stderr": 0.0034010255178737263
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.029670906124630872,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.029670906124630872
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944863,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944863
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608304,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.4,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528183,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528183
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6016405207483612,
"mc2_stderr": 0.015192119540299543
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989247
},
"harness|gsm8k|5": {
"acc": 0.709628506444276,
"acc_stderr": 0.012503592481818957
}
}
```
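As a rough illustration of how the aggregated `"all"` block relates to the per-task scores, the snippet below recomputes a macro-average accuracy from a small excerpt of the results above. The task names and values are copied from the JSON; the unweighted-mean averaging scheme is an assumption for illustration, not a verified description of how the leaderboard computes its summary.

```python
# Recompute a macro-average accuracy from a few per-task "acc" entries
# excerpted from the results JSON above. Assumption: the summary is a
# simple unweighted mean over tasks (not verified here).
excerpt = {
    "harness|hendrycksTest-abstract_algebra|5": 0.33,
    "harness|hendrycksTest-anatomy|5": 0.6518518518518519,
    "harness|hendrycksTest-astronomy|5": 0.7039473684210527,
    "harness|hendrycksTest-business_ethics|5": 0.65,
}

macro_avg = sum(excerpt.values()) / len(excerpt)
print(f"macro-average acc over {len(excerpt)} tasks: {macro_avg:.4f}")
```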
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Etienne-David/GlobalWheatHeadDataset2021 | ---
language:
- en
license: cc-by-4.0
task_categories:
- object-detection
pretty_name: Global Wheat Head
tags:
- agriculture
- biology
dataset_info:
features:
- name: image
dtype: image
- name: domain
dtype: string
- name: country
dtype: string
- name: location
dtype: string
- name: development_stage
dtype: string
- name: objects
struct:
- name: boxes
sequence:
sequence: int64
- name: categories
sequence: int64
splits:
- name: train
num_bytes: 701105106.93
num_examples: 3655
- name: validation
num_bytes: 264366740.324
num_examples: 1476
- name: test
num_bytes: 301053063.17
num_examples: 1381
download_size: 1260938177
dataset_size: 1266524910.424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "Global Wheat Head Dataset 2021" 😊
For updates on the Global Wheat Dataset Community, visit https://www.global-wheat.com/
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Composition](#dataset-composition)
- [Usage](#usage)
- [Citation](#citation)
- [Acknowledgements](#acknowledgements)
## Dataset Description
- **Creators**: Etienne David and others
- **Published**: July 12, 2021 | Version 1.0
- **Availability**: [Zenodo Link](https://doi.org/10.5281/zenodo.5092309)
- **Keywords**: Deep Learning, Wheat Counting, Plant Phenotyping
### Introduction
Wheat is essential for a large part of humanity. The "Global Wheat Head Dataset 2021" aims to support the development of deep learning models for wheat head detection. This dataset addresses challenges like overlapping plants and varying conditions across global wheat fields. It's a step towards automating plant phenotyping and enhancing agricultural practices. 🌾
### Dataset Composition
- **Images**: Over 6000, Resolution - 1024x1024 pixels
- **Annotations**: 300k+ unique wheat heads with bounding boxes
- **Geographic Coverage**: Images from 11 countries
- **Domains**: Various, including sensor types and locations
- **Splits**: Training (Europe & Canada), Test (Other regions)
## Dataset Composition
### Files and Structure
- **Images**: Folder containing all images (`.png`)
- **CSV Files**: `competition_train.csv`, `competition_val.csv`, `competition_test.csv` for different dataset splits
- **Metadata**: `Metadata.csv` with additional details
### Labels
- **Format**: CSV with columns - image_name, BoxesString, domain
- **BoxesString**: `[x_min, y_min, x_max, y_max]` format for bounding boxes
- **Domain**: Specifies the image domain
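As a sketch of how these labels might be consumed, the BoxesString column can be parsed into a list of boxes. The separators and the `"no_box"` sentinel below are assumptions, not guaranteed by this card — inspect the actual CSV files to confirm them before relying on this:

```python
def parse_boxes_string(boxes_string, box_sep=";", coord_sep=" "):
    """Parse a BoxesString into a list of [x_min, y_min, x_max, y_max] boxes.

    The separators are assumptions; inspect the CSV to confirm them.
    "no_box" (used in some GWHD releases for images without wheat heads,
    also an assumption here) yields an empty list.
    """
    if boxes_string.strip() in ("", "no_box"):
        return []
    boxes = []
    for box in boxes_string.split(box_sep):
        coords = [int(float(c)) for c in box.strip().split(coord_sep)]
        if len(coords) != 4:
            raise ValueError(f"Expected 4 coordinates, got: {box!r}")
        boxes.append(coords)
    return boxes

print(parse_boxes_string("10 20 30 40;50 60 70 80"))  # [[10, 20, 30, 40], [50, 60, 70, 80]]
```

Keeping the separators as parameters makes it easy to adjust once you have checked the real files.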
## Usage
### Tutorials and Resources
- Tutorials available at [AIcrowd Challenge Page](https://www.aicrowd.com/challenges/global-wheat-challenge-2021)
### License
- **Type**: Creative Commons Attribution 4.0 International (cc-by-4.0)
- **Details**: Free to use with attribution
## Citation
If you use this dataset in your research, please cite the following:
```bibtex
@article{david2020global,
title={Global Wheat Head Detection (GWHD) dataset: a large and diverse dataset of high-resolution RGB-labelled images to develop and benchmark wheat head detection methods},
author={David, Etienne and others},
journal={Plant Phenomics},
volume={2020},
year={2020},
publisher={Science Partner Journal}
}
@misc{david2021global,
title={Global Wheat Head Dataset 2021: more diversity to improve the benchmarking of wheat head localization methods},
author={Etienne David and others},
year={2021},
eprint={2105.07660},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
## Acknowledgements
Special thanks to all the contributors, researchers, and institutions that played a pivotal role in the creation of this dataset. Your efforts are helping to advance the field of agricultural sciences and technology. 👏
|
milesbutler/consumer_complaints | ---
license: mit
---
This dataset is from Kaggle and originally comes from the US Consumer Finance Complaints. It is a great dataset for NLP multi-class classification.
|
heliosprime/twitter_dataset_1713161053 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8453
num_examples: 24
download_size: 11144
dataset_size: 8453
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713161053"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
renumics/spotlight-osunlp-MagicBrush-enrichment | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: img_id.embedding
sequence: float32
length: 2
- name: source_img.embedding
sequence: float32
length: 2
- name: mask_img.embedding
sequence: float32
length: 2
- name: instruction.embedding
sequence: float32
length: 2
- name: target_img.embedding
sequence: float32
length: 2
splits:
- name: train
num_bytes: 352280
num_examples: 8807
- name: dev
num_bytes: 21120
num_examples: 528
download_size: 524053
dataset_size: 373400
---
# Dataset Card for "spotlight-osunlp-MagicBrush-enrichment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HKBU-NLP/GOAT-Bench | ---
language:
- en
---
# The GOAT Benchmark ([HomePage](https://goatlmm.github.io/))

We introduce the GOAT-Bench, a comprehensive and specialized dataset designed to evaluate large multimodal models through meme-based multimodal social abuse. GOAT-Bench comprises over 6K diverse memes, encompassing a range of themes including hate speech and offensive content. Our focus is to assess the ability of LMMs to accurately identify online abuse, specifically in terms of hatefulness, misogyny, offensiveness, sarcasm, and harmfulness. We meticulously control for the granularity of each specific meme task to facilitate a detailed analysis. Furthermore, we extend our evaluation to assess the effectiveness of thought chains in discerning the underlying implications of memes for deducing their potential threat to safety.
# Experiment Results



# BibTeX
```
@misc{lin2024goatbench,
title={GOAT-Bench: Safety Insights to Large Multimodal Models through Meme-Based Social Abuse},
author={Hongzhan Lin and Ziyang Luo and Bo Wang and Ruichao Yang and Jing Ma},
year={2024},
eprint={2401.01523},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
# Ethics and Broader Impact
The aim of this research focuses on the safety issue related to LMMs, to curb the dissemination of abusive memes and protect individuals from exposure to bias, racial, and gender-based discrimination. However, we acknowledge the risk that malicious actors might attempt to reverse-engineer memes that could evade detection by AI systems trained on LMMs. We vehemently discourage and denounce such practices, and emphasize that human moderation is essential to prevent such occurrences. Aware of the potential psychological impact on those evaluating abusive content, we have instituted protective measures for our human evaluators, including: 1) explicit consent regarding exposure to potentially abusive content, 2) a cap on weekly evaluations to manage exposure and advocate for reasonable daily workloads, and 3) recommendations to discontinue their review should they experience distress. We also conduct regular well-being checks to monitor their mental health. Additionally, the use of Facebook’s meme dataset necessitates adherence to Facebook’s terms of use; our use of these memes complies with these terms. It is important to note that all data organized are restricted to meme content and do not include any personal user data.
# License
For the tasks encompassing Misogyny, Offensiveness, Sarcasm, and Harmfulness, the data is provided under the MIT license.
Regarding the task of Hatefulness, the usage of Facebook’s hateful meme dataset requires compliance with Facebook's terms of use. Our utilization of these memes adheres to these terms.
In alignment with Facebook’s licensing conditions for the memes, the GOAT-Bench includes only the annotated text for the Facebook data, and not the actual hateful memes. Users interested in accessing these memes must download them separately from the Facebook Hateful Meme Challenge website: https://hatefulmemeschallenge.com/#download. |
iaaoli2/arianaw2 | ---
license: openrail
---
|
heliosprime/twitter_dataset_1713075128 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13461
num_examples: 28
download_size: 10808
dataset_size: 13461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713075128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cfilt/HiNER-original | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- hi
license: "cc-by-sa-4.0"
multilinguality:
- monolingual
paperswithcode_id: hiner-original-1
pretty_name: HiNER - Large Hindi Named Entity Recognition dataset
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---
<p align="center"><img src="https://huggingface.co/datasets/cfilt/HiNER-collapsed/raw/main/cfilt-dark-vec.png" alt="Computation for Indian Language Technology Logo" width="150" height="150"/></p>
# Dataset Card for HiNER-original
[](https://twitter.com/cfiltnlp)
[](https://twitter.com/PeopleCentredAI)
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://github.com/cfiltnlp/HiNER
- **Repository:** https://github.com/cfiltnlp/HiNER
- **Paper:** https://arxiv.org/abs/2204.13743
- **Leaderboard:** https://paperswithcode.com/sota/named-entity-recognition-on-hiner-original
- **Point of Contact:** Rudra Murthy V
### Dataset Summary
This dataset was created for the fundamental NLP task of Named Entity Recognition for the Hindi language at CFILT Lab, IIT Bombay. We gathered the dataset from various government information webpages and manually annotated these sentences as a part of our data collection strategy.
**Note:** The dataset contains sentences from ILCI and other sources. ILCI dataset requires license from Indian Language Consortium due to which we do not distribute the ILCI portion of the data. Please send us a mail with proof of ILCI data acquisition to obtain the full dataset.
### Supported Tasks and Leaderboards
Named Entity Recognition
### Languages
Hindi
## Dataset Structure
### Data Instances
{'id': '0', 'tokens': ['प्राचीन', 'समय', 'में', 'उड़ीसा', 'को', 'कलिंग', 'के', 'नाम', 'से', 'जाना', 'जाता', 'था', '।'], 'ner_tags': [0, 0, 0, 3, 0, 3, 0, 0, 0, 0, 0, 0, 0]}
### Data Fields
- `id`: The ID value of the data point.
- `tokens`: Raw tokens in the dataset.
- `ner_tags`: the NER tags for this dataset.
### Data Splits
| | Train | Valid | Test |
| ----- | ------ | ----- | ---- |
| original | 76025 | 10861 | 21722|
| collapsed | 76025 | 10861 | 21722|
## About
This repository contains the Hindi Named Entity Recognition dataset (HiNER) published at the Language Resources and Evaluation conference (LREC) in 2022. A pre-print via arXiv is available [here](https://arxiv.org/abs/2204.13743).
### Recent Updates
* Version 0.0.5: HiNER initial release
## Usage
You should have the 'datasets' package installed to be able to use the :rocket: HuggingFace datasets repository. Please use the following command and install via pip:
```code
pip install datasets
```
To use the original dataset with all the tags, please use:<br/>
```python
from datasets import load_dataset
hiner = load_dataset('cfilt/HiNER-original')
```
To use the collapsed dataset with only PER, LOC, and ORG tags, please use:<br/>
```python
from datasets import load_dataset
hiner = load_dataset('cfilt/HiNER-collapsed')
```
However, the CoNLL format dataset files can also be found on this Git repository under the [data](data/) folder.
## Model(s)
Our best performing models are hosted on the HuggingFace models repository:
1. [HiNER-Collapsed-XLM-R](https://huggingface.co/cfilt/HiNER-Collapse-XLM-Roberta-Large)
2. [HiNER-Original-XLM-R](https://huggingface.co/cfilt/HiNER-Original-XLM-Roberta-Large)
## Dataset Creation
### Curation Rationale
HiNER was built on data extracted from various government websites handled by the Government of India which provide information in Hindi. The dataset was built for the task of Named Entity Recognition, and was introduced to bring new resources to Hindi, a language under-served in Natural Language Processing.
### Source Data
#### Initial Data Collection and Normalization
HiNER was built on data extracted from various government websites handled by the Government of India which provide information in Hindi
#### Who are the source language producers?
Various Government of India webpages
### Annotations
#### Annotation process
This dataset was manually annotated by a single annotator over a long span of time.
#### Who are the annotators?
Pallab Bhattacharjee
### Personal and Sensitive Information
We ensured that there was no sensitive information present in the dataset. All the data points are curated from publicly available information.
## Considerations for Using the Data
### Social Impact of Dataset
The purpose of this dataset is to provide a large Hindi Named Entity Recognition dataset. Since the information (data points) has been obtained from public resources, we do not think there is a negative social impact in releasing this data.
### Discussion of Biases
Any biases contained in the data released by the Indian government are bound to be present in our data.
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
Pallab Bhattacharjee
### Licensing Information
CC-BY-SA 4.0
### Citation Information
```latex
@misc{https://doi.org/10.48550/arxiv.2204.13743,
doi = {10.48550/ARXIV.2204.13743},
url = {https://arxiv.org/abs/2204.13743},
author = {Murthy, Rudra and Bhattacharjee, Pallab and Sharnagat, Rahul and Khatri, Jyotsana and Kanojia, Diptesh and Bhattacharyya, Pushpak},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {HiNER: A Large Hindi Named Entity Recognition Dataset},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` |
davanstrien/wikiart-resized | ---
dataset_info:
features:
- name: image
dtype: image
- name: artist
dtype:
class_label:
names:
'0': Unknown Artist
'1': boris-kustodiev
'2': camille-pissarro
'3': childe-hassam
'4': claude-monet
'5': edgar-degas
'6': eugene-boudin
'7': gustave-dore
'8': ilya-repin
'9': ivan-aivazovsky
'10': ivan-shishkin
'11': john-singer-sargent
'12': marc-chagall
'13': martiros-saryan
'14': nicholas-roerich
'15': pablo-picasso
'16': paul-cezanne
'17': pierre-auguste-renoir
'18': pyotr-konchalovsky
'19': raphael-kirchner
'20': rembrandt
'21': salvador-dali
'22': vincent-van-gogh
'23': hieronymus-bosch
'24': leonardo-da-vinci
'25': albrecht-durer
'26': edouard-cortes
'27': sam-francis
'28': juan-gris
'29': lucas-cranach-the-elder
'30': paul-gauguin
'31': konstantin-makovsky
'32': egon-schiele
'33': thomas-eakins
'34': gustave-moreau
'35': francisco-goya
'36': edvard-munch
'37': henri-matisse
'38': fra-angelico
'39': maxime-maufra
'40': jan-matejko
'41': mstislav-dobuzhinsky
'42': alfred-sisley
'43': mary-cassatt
'44': gustave-loiseau
'45': fernando-botero
'46': zinaida-serebriakova
'47': georges-seurat
'48': isaac-levitan
'49': joaquãn-sorolla
'50': jacek-malczewski
'51': berthe-morisot
'52': andy-warhol
'53': arkhip-kuindzhi
'54': niko-pirosmani
'55': james-tissot
'56': vasily-polenov
'57': valentin-serov
'58': pietro-perugino
'59': pierre-bonnard
'60': ferdinand-hodler
'61': bartolome-esteban-murillo
'62': giovanni-boldini
'63': henri-martin
'64': gustav-klimt
'65': vasily-perov
'66': odilon-redon
'67': tintoretto
'68': gene-davis
'69': raphael
'70': john-henry-twachtman
'71': henri-de-toulouse-lautrec
'72': antoine-blanchard
'73': david-burliuk
'74': camille-corot
'75': konstantin-korovin
'76': ivan-bilibin
'77': titian
'78': maurice-prendergast
'79': edouard-manet
'80': peter-paul-rubens
'81': aubrey-beardsley
'82': paolo-veronese
'83': joshua-reynolds
'84': kuzma-petrov-vodkin
'85': gustave-caillebotte
'86': lucian-freud
'87': michelangelo
'88': dante-gabriel-rossetti
'89': felix-vallotton
'90': nikolay-bogdanov-belsky
'91': georges-braque
'92': vasily-surikov
'93': fernand-leger
'94': konstantin-somov
'95': katsushika-hokusai
'96': sir-lawrence-alma-tadema
'97': vasily-vereshchagin
'98': ernst-ludwig-kirchner
'99': mikhail-vrubel
'100': orest-kiprensky
'101': william-merritt-chase
'102': aleksey-savrasov
'103': hans-memling
'104': amedeo-modigliani
'105': ivan-kramskoy
'106': utagawa-kuniyoshi
'107': gustave-courbet
'108': william-turner
'109': theo-van-rysselberghe
'110': joseph-wright
'111': edward-burne-jones
'112': koloman-moser
'113': viktor-vasnetsov
'114': anthony-van-dyck
'115': raoul-dufy
'116': frans-hals
'117': hans-holbein-the-younger
'118': ilya-mashkov
'119': henri-fantin-latour
'120': m.c.-escher
'121': el-greco
'122': mikalojus-ciurlionis
'123': james-mcneill-whistler
'124': karl-bryullov
'125': jacob-jordaens
'126': thomas-gainsborough
'127': eugene-delacroix
'128': canaletto
- name: genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
'10': Unknown Genre
- name: style
dtype:
class_label:
names:
'0': Abstract_Expressionism
'1': Action_painting
'2': Analytical_Cubism
'3': Art_Nouveau
'4': Baroque
'5': Color_Field_Painting
'6': Contemporary_Realism
'7': Cubism
'8': Early_Renaissance
'9': Expressionism
'10': Fauvism
'11': High_Renaissance
'12': Impressionism
'13': Mannerism_Late_Renaissance
'14': Minimalism
'15': Naive_Art_Primitivism
'16': New_Realism
'17': Northern_Renaissance
'18': Pointillism
'19': Pop_Art
'20': Post_Impressionism
'21': Realism
'22': Rococo
'23': Romanticism
'24': Symbolism
'25': Synthetic_Cubism
'26': Ukiyo_e
splits:
- name: train
num_bytes: 5066964513.5
num_examples: 81444
download_size: 5065060725
dataset_size: 5066964513.5
tags:
- art
- 'lam '
size_categories:
- 10K<n<100K
---
# Dataset Card for "wikiart-resized"
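The `artist`, `genre`, and `style` columns are integer class labels. With the `datasets` library, `dataset.features["style"].int2str(i)` converts them back to names; the small dict below just mirrors a few entries from this card's YAML for illustration and is not the full inventory:

```python
# A few style ids copied from the card's YAML, for illustration only.
style_names = {12: "Impressionism", 20: "Post_Impressionism",
               21: "Realism", 26: "Ukiyo_e"}

def style_name(label_id, names=style_names):
    """Look up a style name for an integer label; 'unknown' if absent."""
    return names.get(label_id, "unknown")

print(style_name(12))  # -> "Impressionism"
```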
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Apinapi/Pamonha | ---
license: openrail
---
|
schwepat/anonymized-amazon-dataset | ---
license: mit
---
|
CVasNLPExperiments/VQAv2_minival_validation_google_flan_t5_xxl_mode_T_A_CM_Q_rices_ns_25994 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 160199562
num_examples: 25994
download_size: 25173468
dataset_size: 160199562
---
# Dataset Card for "VQAv2_minival_validation_google_flan_t5_xxl_mode_T_A_CM_Q_rices_ns_25994"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JelleWo/vox_populi_en_VALTEST_pseudo_labelled | ---
dataset_info:
config_name: en
features:
- name: audio_id
dtype: string
- name: language
dtype:
class_label:
names:
'0': en
'1': de
'2': fr
'3': es
'4': pl
'5': it
'6': ro
'7': hu
'8': cs
'9': nl
'10': fi
'11': hr
'12': sk
'13': sl
'14': et
'15': lt
'16': en_accented
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: raw_text
dtype: string
- name: normalized_text
dtype: string
- name: gender
dtype: string
- name: speaker_id
dtype: string
- name: is_gold_transcript
dtype: bool
- name: accent
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: validation
num_bytes: 1149008063.766
num_examples: 1753
- name: test
num_bytes: 1144657521.808
num_examples: 1842
download_size: 1878566845
dataset_size: 2293665585.5740004
configs:
- config_name: en
data_files:
- split: validation
path: en/validation-*
- split: test
path: en/test-*
---
|
Papersnake/people_daily_news | ---
license: cc0-1.0
---
# 人民日报(1946-2023)数据集
The dataset is part of CialloCorpus, available at https://github.com/prnake/CialloCorpus
|
TheGreatP/vozjoaoV15 | ---
license: openrail
---
|
Michael823/semantic-try2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 3347017.0
num_examples: 10
- name: validation
num_bytes: 834103.0
num_examples: 3
download_size: 4200704
dataset_size: 4181120.0
---
# Dataset Card for "semantic-try2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/ultrachat_200k_filtered_1708381525 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_token_len
dtype: int64
- name: query_reference_response
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
- name: reference_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
splits:
- name: test_sft
num_bytes: 21419543.582
num_examples: 541
- name: train_sft
num_bytes: 22566624.043
num_examples: 571
download_size: 11106700
dataset_size: 43986167.625
---
# Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=None,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_query_length=1000,
max_sft_query_response_length=2000,
max_sft_response_length=1000,
max_rm_query_response_length=2000,
max_rm_response_length=1000),
'push_to_hub': True}
```
|
sruly/lamed-data | ---
license: apache-2.0
---
|
alif75/tes | ---
license: unknown
---
|
version-control/ds-lib-extract-1m | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: hexsha
dtype: string
- name: file_path
dtype: string
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 26302
num_examples: 6
download_size: 28869
dataset_size: 26302
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
crcj/crcj | ---
license: apache-2.0
---
|
ChristophSchuhmann/wikipedia-3sentence-level-retrieval-index | ---
license: apache-2.0
---
https://youtu.be/8FS0oUB-eCI |
ylacombe/google-chilean-spanish | ---
dataset_info:
- config_name: female
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 974926631.856
num_examples: 1738
download_size: 762982190
dataset_size: 974926631.856
- config_name: male
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 1472568181.048
num_examples: 2636
download_size: 1133624286
dataset_size: 1472568181.048
configs:
- config_name: female
data_files:
- split: train
path: female/train-*
- config_name: male
data_files:
- split: train
path: male/train-*
task_categories:
- text-to-speech
- text-to-audio
language:
- es
pretty_name: Chilean Spanish Speech
license: cc-by-sa-4.0
---
# Dataset Card for Chilean Spanish Speech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Statistics](#data-statistics)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Crowdsourced high-quality Chilean Spanish speech data set.](https://www.openslr.org/71/)
- **Repository:** [Google Language Resources and Tools](https://github.com/google/language-resources)
- **Paper:** [Crowdsourcing Latin American Spanish for Low-Resource Text-to-Speech](https://aclanthology.org/2020.lrec-1.801/)
### Dataset Summary
This dataset consists of 7 hours of transcribed high-quality audio of Chilean Spanish sentences recorded by 31 volunteers. The dataset is intended for speech technologies.
The data archives were restructured from the original ones from [OpenSLR](http://www.openslr.org/71/) to make it easier to stream.
### Supported Tasks
- `text-to-speech`, `text-to-audio`: The dataset can be used to train a model for Text-To-Speech (TTS).
- `automatic-speech-recognition`, `speaker-identification`: The dataset can also be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER).
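For reference, the word error rate mentioned above is the word-level edit distance between reference and hypothesis, normalized by reference length. This is a minimal sketch; libraries such as `jiwer` implement it more robustly:

```python
def wer(reference, hypothesis):
    """Word error rate: word-level edit distance / reference length (minimal sketch)."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over word tokens.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("la vigencia de tu tarjeta", "la vigencia de la tarjeta"))  # 0.2
```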
### How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the female config, simply specify the corresponding config name (i.e., "female" for female speakers):
```python
from datasets import load_dataset
dataset = load_dataset("ylacombe/google-chilean-spanish", "female", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
dataset = load_dataset("ylacombe/google-chilean-spanish", "female", split="train", streaming=True)
print(next(iter(dataset)))
```
#### *Bonus*
You can create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
**Local:**
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
dataset = load_dataset("ylacombe/google-chilean-spanish", "female", split="train")
batch_sampler = BatchSampler(RandomSampler(dataset), batch_size=32, drop_last=False)
dataloader = DataLoader(dataset, batch_sampler=batch_sampler)
```
**Streaming:**
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
dataset = load_dataset("ylacombe/google-chilean-spanish", "female", split="train", streaming=True)
dataloader = DataLoader(dataset, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file called `audio` and its transcription, called `text`. Some additional information about the speaker and the passage which contains the transcription is provided.
```
{'audio': {'path': 'clf_09334_01278378087.wav', 'array': array([-9.15527344e-05, -4.57763672e-04, -4.88281250e-04, ...,
1.86157227e-03, 2.10571289e-03, 2.31933594e-03]), 'sampling_rate': 48000}, 'text': 'La vigencia de tu tarjeta es de ocho meses', 'speaker_id': 9334}
```
### Data Fields
- audio: A dictionary containing the audio filename, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- text: the transcription of the audio file.
- speaker_id: unique id of the speaker. The same speaker id can be found for multiple data samples.
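Given the decoded `audio` dict described above, the clip duration follows directly from the array length and sampling rate. A small sketch (the synthetic clip below stands in for a real decoded sample):

```python
def clip_duration_seconds(audio):
    """Duration of a decoded audio sample: samples / sampling rate."""
    return len(audio["array"]) / audio["sampling_rate"]

# Synthetic 2-second clip at the dataset's 48 kHz rate, for illustration.
fake_audio = {"array": [0.0] * 96000, "sampling_rate": 48000}
print(clip_duration_seconds(fake_audio))  # 2.0
```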
### Data Statistics
| | Total duration (h) | # speakers | # sentences | # total words | # unique words |
|--------|--------------------|------------|-------------|---------------|----------------|
| Female | 2.84 | 13 | 1738 | 16591 | 3279 |
| Male | 4.31 | 18 | 2636 | 25168 | 4171 |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
License: [CC BY-SA 4.0 DEED](https://creativecommons.org/licenses/by-sa/4.0/deed.en)
### Citation Information
```
@inproceedings{guevara-rukoz-etal-2020-crowdsourcing,
title = {{Crowdsourcing Latin American Spanish for Low-Resource Text-to-Speech}},
author = {Guevara-Rukoz, Adriana and Demirsahin, Isin and He, Fei and Chu, Shan-Hui Cathy and Sarin, Supheakmungkol and Pipatsrisawat, Knot and Gutkin, Alexander and Butryna, Alena and Kjartansson, Oddur},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference (LREC)},
year = {2020},
month = may,
address = {Marseille, France},
publisher = {European Language Resources Association (ELRA)},
url = {https://www.aclweb.org/anthology/2020.lrec-1.801},
pages = {6504--6513},
ISBN = {979-10-95546-34-4},
}
```
### Contributions
Thanks to [@ylacombe](https://github.com/ylacombe) for adding this dataset. |
distilled-from-one-sec-cv12/chunk_92 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1272284384
num_examples: 247912
download_size: 1299244641
dataset_size: 1272284384
---
# Dataset Card for "chunk_92"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_psychology-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 147908
num_examples: 545
download_size: 86072
dataset_size: 147908
---
# Dataset Card for "mmlu-high_school_psychology-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sam-chami/asturian | ---
license: wtfpl
---
|
liuyanchen1015/MULTI_VALUE_qqp_non_coordinated_subj_obj | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 383720
num_examples: 1875
- name: test
num_bytes: 3949111
num_examples: 19434
- name: train
num_bytes: 3561612
num_examples: 17172
download_size: 4867281
dataset_size: 7894443
---
# Dataset Card for "MULTI_VALUE_qqp_non_coordinated_subj_obj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PNLPhub/Pars-ABSA | ---
license: mit
---
|
Harene/guanaco-llama2-100-rows | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 184326
num_examples: 100
download_size: 111858
dataset_size: 184326
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-100-rows"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KentoTsu/laga | ---
license: openrail
---
|
SilentSpeak/EGCLLC | ---
license: cc-by-4.0
language:
- en
size_categories:
- 10K<n<100K
---
# Enhanced GRID Corpus with Lip Landmark Coordinates
## Introduction
This enhanced version of the GRID audiovisual sentence corpus, originally available at [Zenodo](https://zenodo.org/records/3625687), adds significant new features for audiovisual speech recognition research. Building on the preprocessed data from [LipNet-PyTorch](https://github.com/VIPL-Audio-Visual-Speech-Understanding/LipNet-PyTorch), we have added lip landmark coordinates to the dataset, providing detailed positional information for key points around the lips and greatly enhancing its utility for visual speech recognition and related fields. To make the data easy to access and integrate into existing machine learning workflows, we publish this enriched dataset on the Hugging Face platform.
## Dataset Structure
This dataset is split into 3 directories:
- `lip_images`: contains the images of the lips
- `speaker_id`: contains the videos of a particular speaker
- `video_id`: contains the video frames of a particular video
- `frame_no.jpg`: jpg image of the lips of a particular frame
- `lip_coordinates`: contains the landmark coordinates of the lips
- `speaker_id`: contains the lip landmark of a particular speaker
- `video_id.json`: a json file containing the lip landmark coordinates of a particular video, where the keys are the frame numbers and the values are the x, y lip landmark coordinates
- `GRID_alignments`: contains the alignments of all the videos in the dataset
- `speaker_id`: contains the alignments of a particular speaker
- `video_id.align`: contains the alignments of a particular video, where each line is a word and the start and end time of the word in the video
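
The layout above can be read with plain standard-library code. The sketch below is illustrative, not part of the dataset's tooling: it assumes the JSON values are lists of `[x, y]` pairs and that each `.align` line has the form `start end word`, and it builds tiny synthetic files in that shape rather than reading the real corpus.

```python
import json
from pathlib import Path
from tempfile import TemporaryDirectory

def load_lip_coordinates(json_path):
    """Return {frame_no (int): list of (x, y) points} from a lip-coordinates JSON file."""
    with open(json_path) as f:
        raw = json.load(f)
    return {int(frame): [tuple(pt) for pt in points] for frame, points in raw.items()}

def load_alignment(align_path):
    """Return (word, start, end) tuples from a .align file, assuming 'start end word' lines."""
    entries = []
    for line in Path(align_path).read_text().splitlines():
        start, end, word = line.split()
        entries.append((word, int(start), int(end)))
    return entries

# Synthetic files mimicking the layout described above (not real corpus data).
with TemporaryDirectory() as tmp:
    coords_file = Path(tmp) / "video_id.json"
    coords_file.write_text(json.dumps({"0": [[10, 20], [12, 21]], "1": [[11, 20], [13, 22]]}))
    align_file = Path(tmp) / "video_id.align"
    align_file.write_text("0 23750 sil\n23750 29500 bin\n")

    coords = load_lip_coordinates(coords_file)
    words = load_alignment(align_file)
    print(coords[0])   # first frame's lip points
    print(words[1])    # ('bin', 23750, 29500)
```
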
## Details
The lip landmark coordinates were extracted from the original GRID videos with the dlib library, using the pretrained [shape_predictor_68_face_landmarks_GTX.dat](https://github.com/davisking/dlib-models) model. For each video, the coordinates are saved in a JSON file whose keys are frame numbers and whose values are the x, y lip landmark coordinates, stored in the same order as the frames in the video.
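
In dlib's 68-point facial landmark convention, the mouth region is covered by points 48–67 (48–59 outer lip, 60–67 inner lip). A minimal sketch of slicing just those points out of a full landmark set (the detector and predictor calls are omitted; the 68-point input here is synthetic):

```python
# Indices of the mouth region in dlib's 68-point facial landmark model.
OUTER_LIP = range(48, 60)
INNER_LIP = range(60, 68)

def lip_points(landmarks_68):
    """Slice the 20 mouth points out of a full 68-point landmark list."""
    assert len(landmarks_68) == 68, "expected a full 68-point landmark set"
    return [landmarks_68[i] for i in list(OUTER_LIP) + list(INNER_LIP)]

# Synthetic 68-point input: point i is placed at (i, i) for illustration.
full = [(i, i) for i in range(68)]
lips = lip_points(full)
print(len(lips))   # 20
print(lips[0])     # (48, 48) -- first outer-lip point
```
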
## Usage
The dataset can be downloaded by cloning this repository.
### Cloning the repository
```bash
git clone https://huggingface.co/datasets/SilentSpeak/EGCLLC
```
### Loading the dataset
After cloning the repository, unpack the tar archives, either with the provided dataset_tar.py script or directly with `tar`:
```bash
tar -xvf lip_images.tar
tar -xvf lip_coordinates.tar
tar -xvf GRID_alignments.tar
```
## Acknowledgements
- Alvarez Casado, C., & Bordallo Lopez, M. Real-time face alignment: evaluation methods, training strategies and implementation optimization. Springer Journal of Real-Time Image Processing, 2021.
- Assael, Y., Shillingford, B., Whiteson, S., & Freitas, N. (2017). LipNet: End-to-End Sentence-level Lipreading. GPU Technology Conference.
- Cooke, M., Barker, J., Cunningham, S., & Shao, X. (2006). The Grid Audio-Visual Speech Corpus (1.0) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.3625687
|
suvadityamuk/unifyai-ivy-code-dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: string
- name: metadata
struct:
- name: file_path
dtype: string
- name: repo_id
dtype: string
- name: token_count
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 13572925
num_examples: 1131
download_size: 0
dataset_size: 13572925
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "unifyai-ivy-code-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EiffL/AstroCLIP | ---
license: mit
---
|
Falah/black_and_white_photography_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 190817
num_examples: 1000
download_size: 4063
dataset_size: 190817
---
# Dataset Card for "black_and_white_photography_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_PygmalionAI__metharme-1.3b | ---
pretty_name: Evaluation run of PygmalionAI/metharme-1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PygmalionAI/metharme-1.3b](https://huggingface.co/PygmalionAI/metharme-1.3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__metharme-1.3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T18:39:45.920651](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__metharme-1.3b/blob/main/results_2023-09-22T18-39-45.920651.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.04728187919463099,\n\
\ \"f1_stderr\": 0.0012123660755283244,\n \"acc\": 0.2859533393610357,\n\
\ \"acc_stderr\": 0.008162495625846476\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177333,\n\
\ \"f1\": 0.04728187919463099,\n \"f1_stderr\": 0.0012123660755283244\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \
\ \"acc_stderr\": 0.002389281512077243\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5643251775848461,\n \"acc_stderr\": 0.01393570973961571\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PygmalionAI/metharme-1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T18_39_45.920651
path:
- '**/details_harness|drop|3_2023-09-22T18-39-45.920651.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T18-39-45.920651.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T18_39_45.920651
path:
- '**/details_harness|gsm8k|5_2023-09-22T18-39-45.920651.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T18-39-45.920651.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:50:43.188696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:50:43.188696.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:50:43.188696.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T18_39_45.920651
path:
- '**/details_harness|winogrande|5_2023-09-22T18-39-45.920651.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T18-39-45.920651.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_50_43.188696
path:
- results_2023-07-19T14:50:43.188696.parquet
- split: 2023_09_22T18_39_45.920651
path:
- results_2023-09-22T18-39-45.920651.parquet
- split: latest
path:
- results_2023-09-22T18-39-45.920651.parquet
---
# Dataset Card for Evaluation run of PygmalionAI/metharme-1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PygmalionAI/metharme-1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PygmalionAI/metharme-1.3b](https://huggingface.co/PygmalionAI/metharme-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PygmalionAI__metharme-1.3b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T18:39:45.920651](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__metharme-1.3b/blob/main/results_2023-09-22T18-39-45.920651.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.04728187919463099,
"f1_stderr": 0.0012123660755283244,
"acc": 0.2859533393610357,
"acc_stderr": 0.008162495625846476
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.04728187919463099,
"f1_stderr": 0.0012123660755283244
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.002389281512077243
},
"harness|winogrande|5": {
"acc": 0.5643251775848461,
"acc_stderr": 0.01393570973961571
}
}
```
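As a rough consistency check (a sketch, not part of the evaluation harness), the `"acc"` value in the `"all"` section appears to be the plain mean of the per-task accuracies; this can be verified with the values copied from the JSON above:

```python
# Sketch: verify that the aggregate "acc" in the "all" section is the
# mean of the per-task accuracies. Values are copied verbatim from the
# results JSON above; nothing here queries the Hub.
per_task_acc = {
    "harness|gsm8k|5": 0.0075815011372251705,
    "harness|winogrande|5": 0.5643251775848461,
}
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(mean_acc, 12))  # close to the reported "all" -> "acc" of 0.2859533393610357
```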
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
GeorgeBredis/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 3118843.0
num_examples: 42
download_size: 3118955
dataset_size: 3118843.0
---
# Dataset Card for "dreambooth-hackathon-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kirishima_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kirishima/霧島/雾岛 (Azur Lane)
This is the dataset of kirishima/霧島/雾岛 (Azur Lane), containing 43 images and their tags.
The core tags of this character are `horns, purple_eyes, breasts, bangs, short_hair, purple_hair, hair_between_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 51.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirishima_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 43 | 29.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirishima_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 101 | 62.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirishima_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 43 | 45.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirishima_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 101 | 84.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kirishima_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kirishima_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, sunglasses, eyewear_on_head, bare_shoulders, blush, choker, collarbone, bikini, bracelet, crop_top_overhang, heart_cutout, midriff, navel, short_shorts, thigh_strap, cleavage_cutout, denim_shorts, nail_polish, off-shoulder_shirt, official_alternate_costume, white_shirt, brown_hair, closed_mouth, cowboy_shot, day, highleg, outdoors |
| 1 | 10 |  |  |  |  |  | 1girl, glasses, looking_at_viewer, red-framed_eyewear, solo, school_uniform, pleated_skirt, red_necktie, simple_background, smile, under-rim_eyewear, black_pantyhose, white_shirt, black_skirt, collared_shirt, katana, long_sleeves, sheath, white_background, blush, brown_hair, closed_mouth, holding, petals, striped_necktie, bag, open_jacket, sweater_vest |
| 2 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, bare_shoulders, white_background, full_body, holding_weapon, medium_breasts, ninja_mask, simple_background, white_gloves, katana, kimono, smile, turret |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | sunglasses | eyewear_on_head | bare_shoulders | blush | choker | collarbone | bikini | bracelet | crop_top_overhang | heart_cutout | midriff | navel | short_shorts | thigh_strap | cleavage_cutout | denim_shorts | nail_polish | off-shoulder_shirt | official_alternate_costume | white_shirt | brown_hair | closed_mouth | cowboy_shot | day | highleg | outdoors | glasses | red-framed_eyewear | school_uniform | pleated_skirt | red_necktie | simple_background | under-rim_eyewear | black_pantyhose | black_skirt | collared_shirt | katana | long_sleeves | sheath | white_background | holding | petals | striped_necktie | bag | open_jacket | sweater_vest | full_body | holding_weapon | medium_breasts | ninja_mask | white_gloves | kimono | turret |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:-------------|:------------------|:-----------------|:--------|:---------|:-------------|:---------|:-----------|:--------------------|:---------------|:----------|:--------|:---------------|:--------------|:------------------|:---------------|:--------------|:---------------------|:-----------------------------|:--------------|:-------------|:---------------|:--------------|:------|:----------|:-----------|:----------|:---------------------|:-----------------|:----------------|:--------------|:--------------------|:--------------------|:------------------|:--------------|:-----------------|:---------|:---------------|:---------|:-------------------|:----------|:---------|:------------------|:------|:--------------|:---------------|:------------|:-----------------|:-----------------|:-------------|:---------------|:---------|:---------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | | | | | | | | | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | X | | | | | | | X | X | X | X | X | X | X |
|
Ingrid0693/openAiAssistent | ---
license: mit
dataset_info:
features:
- name: source_text
dtype: string
- name: target_text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8808590
num_examples: 7856
- name: validation
num_bytes: 459684
num_examples: 418
download_size: 5263182
dataset_size: 9268274
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
rahular/simple-wikipedia | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 144689943
num_examples: 769764
download_size: 86969379
dataset_size: 144689943
---
# simple-wikipedia
Processed, text-only dump of the Simple Wikipedia (English). Contains 23,886,673 words. |
ranimeree/CycleGAN_ConstSceneSnowyImages | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 301061865.237
num_examples: 2769
- name: validation
num_bytes: 61731350.0
num_examples: 352
- name: test
num_bytes: 61079974.0
num_examples: 348
download_size: 410766724
dataset_size: 423873189.237
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
geovaneand/diretor | ---
license: openrail
---
|
projectbaraat/hindi-qa-data-v0.1 | ---
language:
- hi
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 334492647
num_examples: 167574
download_size: 74390742
dataset_size: 334492647
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lez94/take-off-eyeglasses-200 | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edited_image
dtype: image
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 20494816.0
num_examples: 200
download_size: 20513907
dataset_size: 20494816.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tr416/dataset_20231006_193224 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73841
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_193224"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ricardo-larosa/SWE-bench_Lite_Dev_Extended | ---
dataset_info:
features:
- name: repo
dtype: string
- name: instance_id
dtype: string
- name: base_commit
dtype: string
- name: patch
dtype: string
- name: test_patch
dtype: string
- name: problem_statement
dtype: string
- name: hints_text
dtype: string
- name: created_at
dtype: string
- name: version
dtype: string
- name: FAIL_TO_PASS
dtype: string
- name: PASS_TO_PASS
dtype: string
- name: environment_setup_commit
dtype: string
- name: file_path
dtype: string
- name: file_content
dtype: string
- name: text
dtype: string
splits:
- name: dev
num_bytes: 2089768
num_examples: 23
download_size: 773748
dataset_size: 2089768
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
---
|
open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7 | ---
pretty_name: Evaluation run of Undi95/Mistral-11B-TestBench7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Mistral-11B-TestBench7](https://huggingface.co/Undi95/Mistral-11B-TestBench7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T16:09:31.642289](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7/blob/main/results_2023-10-11T16-09-31.642289.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6399052867360159,\n\
\ \"acc_stderr\": 0.03310704632621164,\n \"acc_norm\": 0.6439213227226402,\n\
\ \"acc_norm_stderr\": 0.03308447285363473,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.4691495265456508,\n\
\ \"mc2_stderr\": 0.014857248788144817\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472432,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.63433578968333,\n \
\ \"acc_stderr\": 0.004806316342709402,\n \"acc_norm\": 0.8286197968532165,\n\
\ \"acc_norm_stderr\": 0.0037607069750393053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.023559646983189946,\n\
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.023559646983189946\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.01612927102509986,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.01612927102509986\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407006,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407006\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.01624202883405362,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.01624202883405362\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n\
\ \"acc_stderr\": 0.012667701919603662,\n \"acc_norm\": 0.4367666232073012,\n\
\ \"acc_norm_stderr\": 0.012667701919603662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623557,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623557\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.4691495265456508,\n\
\ \"mc2_stderr\": 0.014857248788144817\n }\n}\n```"
repo_url: https://huggingface.co/mindy-labs/mindy-7b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-09-31.642289.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-09-31.642289.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-09-31.642289.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-09-31.642289.parquet'
- config_name: results
data_files:
- split: 2023_10_11T16_09_31.642289
path:
- results_2023-10-11T16-09-31.642289.parquet
- split: latest
path:
- results_2023-10-11T16-09-31.642289.parquet
---
# Dataset Card for Evaluation run of Undi95/Mistral-11B-TestBench7
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Mistral-11B-TestBench7
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-TestBench7](https://huggingface.co/Undi95/Mistral-11B-TestBench7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-11T16:09:31.642289](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench7/blob/main/results_2023-10-11T16-09-31.642289.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6399052867360159,
"acc_stderr": 0.03310704632621164,
"acc_norm": 0.6439213227226402,
"acc_norm_stderr": 0.03308447285363473,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.4691495265456508,
"mc2_stderr": 0.014857248788144817
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472432,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.63433578968333,
"acc_stderr": 0.004806316342709402,
"acc_norm": 0.8286197968532165,
"acc_norm_stderr": 0.0037607069750393053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.023559646983189946,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.023559646983189946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.01612927102509986,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.01612927102509986
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407006,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407006
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.01624202883405362,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.01624202883405362
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818777,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818777
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4367666232073012,
"acc_stderr": 0.012667701919603662,
"acc_norm": 0.4367666232073012,
"acc_norm_stderr": 0.012667701919603662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623557,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623557
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.4691495265456508,
"mc2_stderr": 0.014857248788144817
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
anaisk/v2_sinespacios | ---
dataset_info:
features:
- name: Sentence
dtype: string
- name: Audio
dtype: audio
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 314514171.93
num_examples: 9730
download_size: 357778902
dataset_size: 314514171.93
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "v2_sinespacios"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mb7419/career-guidance-reddit | ---
license: cc-by-4.0
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: body
dtype: string
- name: created_utc
dtype: string
- name: url
dtype: string
- name: retrieved_on
dtype: string
- name: question_content
dtype: string
- name: dominant_topic
dtype: int64
- name: dominant_topic_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 32409618
num_examples: 13552
- name: validation
num_bytes: 6891421
num_examples: 2904
- name: test
num_bytes: 7021155
num_examples: 2905
download_size: 26693366
dataset_size: 46322194
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
talha10/image_caption-100-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22842342.0
num_examples: 100
download_size: 22823708
dataset_size: 22842342.0
---
# Dataset Card for "image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-47000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 656878
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hippocrates/MedMCQA | ---
license: apache-2.0
---
|
Olivacker/vitachivoice | ---
license: openrail
---
|
findzebra/case-reports | ---
license: cc-by-4.0
language:
- en
tags:
- medical
size_categories:
- 1K<n<10K
pretty_name: FindZebra case reports
---
# FindZebra case reports
A collection of 3344 case reports fetched from the PubMed API for the Fabry, Gaucher and Familial amyloid cardiomyopathy (FAC) diseases.
Articles are labelled using a text segmentation model described in "FindZebra online search delving into rare disease case reports using natural language processing". |
Kochanoskill/Xayoo | ---
license: openrail
---
|
jjzha/sayfullina | ---
license: unknown
language: en
---
This is the soft-skill dataset created by:
```
@inproceedings{sayfullina2018learning,
title={Learning representations for soft skill matching},
author={Sayfullina, Luiza and Malmi, Eric and Kannala, Juho},
booktitle={Analysis of Images, Social Networks and Texts: 7th International Conference, AIST 2018, Moscow, Russia, July 5--7, 2018, Revised Selected Papers 7},
pages={141--152},
year={2018},
organization={Springer}
}
```
There are no document delimiters. Data is split by user `jjzha`.
Number of samples (sentences):
- train: 3705
- dev: 1855
- test: 1851
Sources:
- Adzuna (UK)
Type of tags:
- B-SOFT
- I-SOFT
- O
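For illustration, the `B-SOFT`/`I-SOFT`/`O` tags can be decoded into contiguous skill phrases with a few lines of Python (a minimal sketch, not shipped with the dataset):

```python
def decode_spans(tokens, tags):
    """Collect contiguous B-/I- tagged token runs into phrases."""
    spans, current = [], []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                spans.append(" ".join(current))
            current = [tok]
        elif tag.startswith("I-") and current:
            current.append(tok)
        else:
            if current:
                spans.append(" ".join(current))
            current = []
    if current:
        spans.append(" ".join(current))
    return spans

tokens = ["and", "sensitive", "when", "deal", "with", "customer", "be",
          "enthusiastic", "always", "eager", "to", "learn", "and",
          "develop", "knowledge", "and", "skill"]
tags = ["O", "O", "O", "O", "O", "O", "O", "B-SOFT", "I-SOFT", "I-SOFT",
        "I-SOFT", "I-SOFT", "O", "O", "O", "O", "O"]
print(decode_spans(tokens, tags))  # → ['enthusiastic always eager to learn']
```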
Sample:
```
{
"idx": 1853,
"tokens": ["and", "sensitive", "when", "deal", "with", "customer", "be", "enthusiastic", "always", "eager", "to", "learn", "and", "develop", "knowledge", "and", "skill"],
"tags_skill": ["O", "O", "O", "O", "O", "O", "O", "B-SOFT", "I-SOFT", "I-SOFT", "I-SOFT", "I-SOFT", "O", "O", "O", "O", "O"]
}
``` |
liuyanchen1015/MULTI_VALUE_mrpc_no_gender_distinction | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 88142
num_examples: 332
- name: train
num_bytes: 216414
num_examples: 810
- name: validation
num_bytes: 25453
num_examples: 96
download_size: 224175
dataset_size: 330009
---
# Dataset Card for "MULTI_VALUE_mrpc_no_gender_distinction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_indef_one | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 13564
num_examples: 162
- name: test
num_bytes: 10626
num_examples: 140
- name: train
num_bytes: 100355
num_examples: 1280
download_size: 61637
dataset_size: 124545
---
# Dataset Card for "MULTI_VALUE_cola_indef_one"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Felladrin/ChatML-webGPT_x_dolly | ---
language:
- en
license: cc-by-sa-3.0
size_categories:
- 10K<n<100K
task_categories:
- question-answering
---
[starfishmedical/webGPT_x_dolly](https://huggingface.co/datasets/starfishmedical/webGPT_x_dolly) in ChatML format, ready to use in [HuggingFace TRL's SFT Trainer](https://huggingface.co/docs/trl/main/en/sft_trainer).
Python code used for conversion:
```python
from datasets import load_dataset
from transformers import AutoTokenizer
import random
tokenizer = AutoTokenizer.from_pretrained("Felladrin/Llama-160M-Chat-v1")
dataset = load_dataset("starfishmedical/webGPT_x_dolly", split="train")
def format(columns):
instruction = columns["instruction"].strip()
input = columns["input"].strip()
assistant_message = columns["output"].strip()
if random.random() < 0.5:
user_message = f"Question:\n{instruction}\n\nContext:\n{input}"
else:
user_message = f"Context:\n{input}\n\nQuestion:\n{instruction}"
messages = [
{
"role": "user",
"content": user_message,
},
{
"role": "assistant",
"content": assistant_message,
},
]
return { "text": tokenizer.apply_chat_template(messages, tokenize=False) }
dataset.map(format).select_columns(['text']).to_parquet("train.parquet")
```
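For reference, ChatML wraps each turn in `<|im_start|>` / `<|im_end|>` markers. A dependency-free sketch of that formatting (illustration only; the actual chat template of `Felladrin/Llama-160M-Chat-v1` may differ in details such as an added BOS token):

```python
def to_chatml(messages):
    """Render a list of {role, content} dicts as a ChatML string."""
    return "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )

print(to_chatml([
    {"role": "user", "content": "Question:\nWhat is TRL?"},
    {"role": "assistant", "content": "A library for training transformers."},
]))
```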
|
Astonzzh/summary_seq_label_balanced | ---
dataset_info:
features:
- name: id
dtype: string
- name: ids
sequence: string
- name: words
sequence: string
- name: labels
sequence: int64
- name: summary
dtype: string
- name: sentences
sequence: string
- name: sentence_labels
sequence: int64
splits:
- name: train
num_bytes: 9014992.927366104
num_examples: 7360
- name: test
num_bytes: 500969.0363169479
num_examples: 409
- name: validation
num_bytes: 500969.0363169479
num_examples: 409
download_size: 3867151
dataset_size: 10016931.0
---
# Dataset Card for "summary_seq_label_balanced"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/vr_train_free_49 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6602904760
num_examples: 10000
download_size: 1036761082
dataset_size: 6602904760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DamarJati/Face-Mask-Detection | ---
language:
- en
pipeline_tag: image-classification
tags:
- art
- face mask
- mask
task_categories:
- image-classification
---
Original datasets https://www.kaggle.com/datasets/ashishjangra27/face-mask-12k-images-dataset |
roa7n/patched_test_p_40_f_SPOUT_m1_predictions | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
- name: m1_preds
dtype: float32
splits:
- name: train
num_bytes: 484629878
num_examples: 1470999
download_size: 49491513
dataset_size: 484629878
---
# Dataset Card for "patched_test_p_40_f_SPOUT_m1_predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carnival13/sur_test | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1297540140
num_examples: 900000
download_size: 298907283
dataset_size: 1297540140
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sur_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ikawrakow/validation-datasets-for-llama.cpp | ---
license: apache-2.0
---
This repository contains validation datasets for use with the `perplexity` tool from the `llama.cpp` project.
**Note:** [PR #5047](https://github.com/ggerganov/llama.cpp/pull/5047) is required to use these datasets.
The simple program in `demo.cpp` shows how to read these files and can be used to combine two files into one.
The simple program in `convert.cpp` shows how to convert the data to JSON. For instance:
```
g++ -o convert convert.cpp
./convert arc-easy-validation.bin arc-easy-validation.json
``` |
matthh/gutenberg-poetry-corpus | ---
license: cc0-1.0
---
|
factored/fr_crawler_class | ---
dataset_info:
features:
- name: labels
dtype:
class_label:
names:
'0': business analyst
'1': data analyst
'2': data engineer
'3': full stack
'4': data scientist
'5': software engineer
'6': devops engineer
'7': front end
'8': business intelligence analyst
'9': machine learning engineer
- name: text
dtype: string
splits:
- name: train
num_bytes: 393756835.62683624
num_examples: 2250902
- name: val
num_bytes: 49219648.18658188
num_examples: 281363
- name: test
num_bytes: 49219648.18658188
num_examples: 281363
download_size: 284157951
dataset_size: 492196132.0
---
# Dataset Card for "fr_crawler2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
matteopilotto/kratos | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 10082811.0
num_examples: 10
download_size: 10084661
dataset_size: 10082811.0
---
# Dataset Card for "kratos"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T | ---
pretty_name: Evaluation run of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T20:19:42.566398](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T/blob/main/results_2023-12-29T20-19-42.566398.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.265691244274486,\n\
\ \"acc_stderr\": 0.031066770980303738,\n \"acc_norm\": 0.26755149869038447,\n\
\ \"acc_norm_stderr\": 0.031835502327294145,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3732177557725045,\n\
\ \"mc2_stderr\": 0.013798981933202878\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3046075085324232,\n \"acc_stderr\": 0.01344952210993249,\n\
\ \"acc_norm\": 0.3387372013651877,\n \"acc_norm_stderr\": 0.01383056892797433\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4493128858793069,\n\
\ \"acc_stderr\": 0.00496407587012034,\n \"acc_norm\": 0.6030671181039634,\n\
\ \"acc_norm_stderr\": 0.004882619484166595\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816503,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.11851851851851852,\n\
\ \"acc_stderr\": 0.027922050250639055,\n \"acc_norm\": 0.11851851851851852,\n\
\ \"acc_norm_stderr\": 0.027922050250639055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.029674167520101456,\n\
\ \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.029674167520101456\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.02648035717989569,\n\
\ \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.02648035717989569\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.02989614568209546,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.02989614568209546\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.18421052631578946,\n\
\ \"acc_stderr\": 0.03646758875075566,\n \"acc_norm\": 0.18421052631578946,\n\
\ \"acc_norm_stderr\": 0.03646758875075566\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776578,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776578\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.0298575156733864,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.0298575156733864\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.03074890536390988,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.03074890536390988\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24220183486238533,\n \"acc_stderr\": 0.018368176306598618,\n \"\
acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.018368176306598618\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814565,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814565\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.22784810126582278,\n \"acc_stderr\": 0.027303484599069422,\n \
\ \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.027303484599069422\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.35874439461883406,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591204,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591204\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\
\ \"acc_stderr\": 0.015696008563807096,\n \"acc_norm\": 0.26053639846743293,\n\
\ \"acc_norm_stderr\": 0.015696008563807096\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n\
\ \"acc_stderr\": 0.014173044098303654,\n \"acc_norm\": 0.2346368715083799,\n\
\ \"acc_norm_stderr\": 0.014173044098303654\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22340425531914893,\n \"acc_stderr\": 0.02484792135806396,\n \
\ \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.02484792135806396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2242503259452412,\n\
\ \"acc_stderr\": 0.010652615824906172,\n \"acc_norm\": 0.2242503259452412,\n\
\ \"acc_norm_stderr\": 0.010652615824906172\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403196,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528044,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528044\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.022401787435256386,\n\
\ \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.022401787435256386\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.03446296217088426,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.03446296217088426\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3732177557725045,\n\
\ \"mc2_stderr\": 0.013798981933202878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5951065509076559,\n \"acc_stderr\": 0.013795927003124934\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \
\ \"acc_stderr\": 0.0032820559171369596\n }\n}\n```"
repo_url: https://huggingface.co/mindy-labs/mindy-7b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|arc:challenge|25_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|gsm8k|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hellaswag|10_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-19-42.566398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T20-19-42.566398.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- '**/details_harness|winogrande|5_2023-12-29T20-19-42.566398.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T20-19-42.566398.parquet'
- config_name: results
data_files:
- split: 2023_12_29T20_19_42.566398
path:
- results_2023-12-29T20-19-42.566398.parquet
- split: latest
path:
- results_2023-12-29T20-19-42.566398.parquet
---
# Dataset Card for Evaluation run of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-29T20:19:42.566398](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T/blob/main/results_2023-12-29T20-19-42.566398.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.265691244274486,
"acc_stderr": 0.031066770980303738,
"acc_norm": 0.26755149869038447,
"acc_norm_stderr": 0.031835502327294145,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3732177557725045,
"mc2_stderr": 0.013798981933202878
},
"harness|arc:challenge|25": {
"acc": 0.3046075085324232,
"acc_stderr": 0.01344952210993249,
"acc_norm": 0.3387372013651877,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.4493128858793069,
"acc_stderr": 0.00496407587012034,
"acc_norm": 0.6030671181039634,
"acc_norm_stderr": 0.004882619484166595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816503,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.11851851851851852,
"acc_stderr": 0.027922050250639055,
"acc_norm": 0.11851851851851852,
"acc_norm_stderr": 0.027922050250639055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.029674167520101456,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.029674167520101456
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.02648035717989569,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.02648035717989569
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.03646758875075566,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.03646758875075566
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776578,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776578
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.0298575156733864,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.0298575156733864
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.03074890536390988,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.03074890536390988
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341933,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.018368176306598618,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.018368176306598618
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02977177522814565,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02977177522814565
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591204,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591204
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.015696008563807096,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.015696008563807096
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2346368715083799,
"acc_stderr": 0.014173044098303654,
"acc_norm": 0.2346368715083799,
"acc_norm_stderr": 0.014173044098303654
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.02484792135806396,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.02484792135806396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2242503259452412,
"acc_stderr": 0.010652615824906172,
"acc_norm": 0.2242503259452412,
"acc_norm_stderr": 0.010652615824906172
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403196,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528044,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.022401787435256386,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.022401787435256386
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.03446296217088426,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.03446296217088426
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3732177557725045,
"mc2_stderr": 0.013798981933202878
},
"harness|winogrande|5": {
"acc": 0.5951065509076559,
"acc_stderr": 0.013795927003124934
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.0032820559171369596
}
}
```
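Most per-task accuracies above sit near the 25% chance level for four-way multiple choice. As a reading aid only (not part of the evaluation pipeline), a small sketch like the following can scan such a results dict for tasks scoring above chance; the `results` dict here mirrors just a slice of the JSON above:

```python
# Sketch: filter per-task accuracies against the 4-way multiple-choice
# chance baseline (0.25). Keys and values copied from the results above.
results = {
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.4583333333333333},
    "harness|hendrycksTest-security_studies|5": {"acc": 0.14285714285714285},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.36},
}

def tasks_above_chance(results, chance=0.25):
    """Return task names whose accuracy exceeds the chance baseline."""
    return sorted(
        task for task, scores in results.items() if scores["acc"] > chance
    )

print(tasks_above_chance(results))
```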
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Tadorne/amendments | ---
language:
- en
license: eupl-1.1
pretty_name: Amendments EP - Legislature 7 & 8
configs:
- config_name: ALDE
data_files: alde.jsonl.gz
- config_name: ECR
data_files: ecr.jsonl.gz
- config_name: EFD
data_files: efd.jsonl.gz
- config_name: ENF
data_files: enf.jsonl.gz
- config_name: EPP
data_files: epp.jsonl.gz
- config_name: EUL
data_files: eul.jsonl.gz
- config_name: GEFA
data_files: gefa.jsonl.gz
- config_name: ID
data_files: id.jsonl.gz
- config_name: NA
data_files: na.jsonl.gz
- config_name: RENEW
data_files: renew.jsonl.gz
- config_name: SD
data_files: sd.jsonl.gz
---
# 🇪🇺 🗳️ European Parliament Amendments : Legislature 7 & 8
Source: https://zenodo.org/record/3757714
|
Sunbird/salt-multispeaker-lug | ---
dataset_info:
features:
- name: ids
dtype: string
- name: texts
dtype: string
- name: audios
sequence: float32
- name: audio_languages
dtype: string
- name: are_studio
dtype: bool
- name: speaker_ids
dtype: string
- name: sample_rates
dtype: int64
splits:
- name: train
num_bytes: 2000645994
num_examples: 5016
- name: dev
num_bytes: 38741356
num_examples: 103
- name: test
num_bytes: 39746693
num_examples: 97
download_size: 1016122402
dataset_size: 2079134043
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
|
tartuNLP/finno-ugric-train | ---
license: cc-by-4.0
---
|
tasksource/cladder | ---
license: mit
language:
- en
---
https://github.com/causalNLP/cladder |
RahulRaman/counting-object-sd-dataset4-clean4 | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 780165241.0
num_examples: 496
download_size: 297832459
dataset_size: 780165241.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Jalilov/document-segment | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 105189330.0
num_examples: 100
download_size: 0
dataset_size: 105189330.0
---
# Dataset Card for "document-segment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maghwa/OpenHermes-2-AR-10K-14-360k-370k | ---
dataset_info:
features:
- name: title
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: topic
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: model_name
dtype: 'null'
- name: source
dtype: string
- name: views
dtype: float64
- name: model
dtype: 'null'
- name: conversations
dtype: string
- name: language
dtype: 'null'
- name: hash
dtype: 'null'
- name: category
dtype: 'null'
- name: idx
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: id
dtype: 'null'
- name: system_prompt
dtype: 'null'
splits:
- name: train
num_bytes: 30826852
num_examples: 10001
download_size: 14281452
dataset_size: 30826852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lipa1919/wikidumps-oscar-pl | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 23435628364
num_examples: 17016858
download_size: 15087497727
dataset_size: 23435628364
---
# Dataset Card for "wikidumps-oscar-pl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Praghxx/Pragh | ---
license: openrail
---
|
alvations/c4p0-v2-en-fr | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: target_backto_source
dtype: string
- name: raw_target
list:
- name: generated_text
dtype: string
- name: raw_target_backto_source
list:
- name: generated_text
dtype: string
- name: prompt
dtype: string
- name: reverse_prompt
dtype: string
- name: source_langid
dtype: string
- name: target_langid
dtype: string
- name: target_backto_source_langid
dtype: string
- name: doc_id
dtype: int64
- name: sent_id
dtype: int64
- name: timestamp
dtype: string
- name: url
dtype: string
- name: doc_hash
dtype: string
- name: dataset
dtype: string
- name: source_lang
dtype: string
- name: target_lang
dtype: string
splits:
- name: train
num_bytes: 9739130
num_examples: 7510
download_size: 4040042
dataset_size: 9739130
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pn_summary | ---
annotations_creators:
- found
language_creators:
- found
language:
- fa
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
- text-classification
task_ids:
- news-articles-summarization
- news-articles-headline-generation
- text-simplification
- topic-classification
paperswithcode_id: pn-summary
pretty_name: Persian News Summary (PnSummary)
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: article
dtype: string
- name: summary
dtype: string
- name: category
dtype:
class_label:
names:
'0': Economy
'1': Roads-Urban
'2': Banking-Insurance
'3': Agriculture
'4': International
'5': Oil-Energy
'6': Industry
'7': Transportation
'8': Science-Technology
'9': Local
'10': Sports
'11': Politics
'12': Art-Culture
'13': Society
'14': Health
'15': Research
'16': Education-University
'17': Tourism
- name: categories
dtype: string
- name: network
dtype:
class_label:
names:
'0': Tahlilbazaar
'1': Imna
'2': Shana
'3': Mehr
'4': Irna
'5': Khabaronline
- name: link
dtype: string
config_name: 1.0.0
splits:
- name: train
num_bytes: 309436493
num_examples: 82022
- name: validation
num_bytes: 21311817
num_examples: 5592
- name: test
num_bytes: 20936820
num_examples: 5593
download_size: 89591141
dataset_size: 351685130
---
# Dataset Card for Persian News Summary (pn_summary)
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/hooshvare/pn-summary/
- **Paper:** https://arxiv.org/abs/2012.11204
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [Mehrdad Farahani](mailto:m3hrdadfphi@gmail.com)
### Dataset Summary
A well-structured summarization dataset for the Persian language consisting of 93,207 records. It is prepared for Abstractive/Extractive summarization tasks (like cnn_dailymail for English). It can also be used in other scopes like Text Generation, Title Generation, and News Category Classification.
Note that newlines in the text were replaced with the `[n]` symbol. Convert them back to normal newlines (e.g. `t.replace("[n]", "\n")`) before using the data.
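As a minimal sketch of this conversion, the helper below restores the `[n]` tokens in one record; the field names follow the record layout described later in this card, and applying it via `datasets.map` is an assumption about the standard Hugging Face API:

```python
def restore_newlines(example):
    # The corpus stores the literal token "[n]" wherever the original
    # text had a newline; replace it back in each text field.
    for field in ("title", "article", "summary"):
        example[field] = example[field].replace("[n]", "\n")
    return example

# Usage with the Hugging Face `datasets` library (not run here):
#   from datasets import load_dataset
#   ds = load_dataset("pn_summary", "1.0.0")
#   ds = ds.map(restore_newlines)
```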
### Supported Tasks and Leaderboards
The dataset is prepared for Abstractive/Extractive summarization tasks (like cnn_dailymail for English). It can also be used in other scopes like Text Generation, Title Generation, and News Category Classification.
### Languages
The dataset is mostly in Persian, with occasional English mixed in.
## Dataset Structure
### Data Instances
A record consists of 8 features:
```python
record = ['id','title', 'article', 'summary', 'category', 'categories', 'network', 'link']
```
In the following, you can see an example of `pn_summary`.
```json
{
"article": "به گزارش شانا، علی کاردر امروز (۲۷ دی ماه) در مراسم تودیع محسن قمصری، مدیر سابق امور بین الملل شرکت ملی نفت ایران و معارفه سعید خوشرو، مدیر جدید امور بین الملل این شرکت، گفت: مدیریت امور بین\u200eالملل به عنوان یکی از تاثیرگذارترین مدیریت\u200cهای شرکت ملی نفت ایران در دوران تحریم\u200cهای ظالمانه غرب علیه کشورمان بسیار هوشمندانه عمل کرد و ما توانستیم به خوبی از عهده تحریم\u200cها برآییم. [n] وی افزود: مجموعه امور بین الملل در همه دوران\u200cها با سختی\u200cها و مشکلات بسیاری مواجه بوده است، به ویژه در دوره اخیر به دلیل مسائل پیرامون تحریم وظیفه سنگینی بر عهده داشت که با تدبیر مدیریت خوب این مجموعه سربلند از آن بیرون آمد. [n] کاردر با قدردانی از زحمات محسن قمصری، به سلامت مدیریت امور بین الملل این شرکت اشاره کرد و افزود: محوریت کار مدیریت اموربین الملل سلامت مالی بوده است. [n] وی بر ضرورت نهادینه سازی جوانگرایی در مدیریت شرکت ملی نفت ایران تاکید کرد و گفت: مدیریت امور بین الملل در پرورش نیروهای زبده و کارآزموده آنچنان قوی عملکرده است که برای انتخاب مدیر جدید مشکلی وجود نداشت. [n] کاردر، حرفه\u200eای\u200eگری و کار استاندارد را از ویژگی\u200cهای مدیران این مدیریت برشمرد و گفت: نگاه جامع، خلاقیت و نوآوری و بکارگیری نیروهای جوان باید همچنان مد نظر مدیریت جدید امور بین الملل شرکت ملی نفت ایران باشد.",
"categories": "نفت",
"category": 5,
"id": "738e296491f8b24c5aa63e9829fd249fb4428a66",
"link": "https://www.shana.ir/news/275284/%D9%85%D8%AF%DB%8C%D8%B1%DB%8C%D8%AA-%D9%81%D8%B1%D9%88%D8%B4-%D9%86%D9%81%D8%AA-%D8%AF%D8%B1-%D8%AF%D9%88%D8%B1%D8%A7%D9%86-%D8%AA%D8%AD%D8%B1%DB%8C%D9%85-%D9%87%D9%88%D8%B4%D9%85%D9%86%D8%AF%D8%A7%D9%86%D9%87-%D8%B9%D9%85%D9%84-%DA%A9%D8%B1%D8%AF",
"network": 2,
"summary": "مدیرعامل شرکت ملی نفت، عملکرد مدیریت امور بین\u200eالملل این شرکت را در دوران تحریم بسیار هوشمندانه خواند و گفت: امور بین الملل در دوران پس از تحریم\u200eها نیز می\u200cتواند نقش بزرگی در تسریع روند توسعه داشته باشد.",
"title": "مدیریت فروش نفت در دوران تحریم هوشمندانه عمل کرد"
}
```
### Data Fields
- `id (string)`: ID of the news.
- `title (string)`: The title of the news.
- `article (string)`: The article of the news.
- `summary (string)`: The summary of the news.
- `category (int)`: The category of news in English (index of categories), including `Economy`, `Roads-Urban`, `Banking-Insurance`, `Agriculture`, `International`, `Oil-Energy`, `Industry`, `Transportation`, `Science-Technology`, `Local`, `Sports`, `Politics`, `Art-Culture`, `Society`, `Health`, `Research`, `Education-University`, `Tourism`.
- `categories (string)`: The category and sub-category of the news in Persian.
- `network (int)`: The news agency name (index of news agencies), including `Tahlilbazaar`, `Imna`, `Shana`, `Mehr`, `Irna`, `Khabaronline`.
- `link (string)`: The link of the news.
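Assuming the `category` and `network` integers index the name lists in the order given in the field descriptions above (a sketch, not an official API of this dataset), the indices can be decoded like this:

```python
# Index-to-name mappings as listed in the field descriptions above.
CATEGORIES = [
    "Economy", "Roads-Urban", "Banking-Insurance", "Agriculture",
    "International", "Oil-Energy", "Industry", "Transportation",
    "Science-Technology", "Local", "Sports", "Politics", "Art-Culture",
    "Society", "Health", "Research", "Education-University", "Tourism",
]
NETWORKS = ["Tahlilbazaar", "Imna", "Shana", "Mehr", "Irna", "Khabaronline"]

def decode_record(record: dict) -> dict:
    """Attach human-readable category/network names to a record."""
    return {
        **record,
        "category_name": CATEGORIES[record["category"]],
        "network_name": NETWORKS[record["network"]],
    }
```

For the example record above (`category` 5, `network` 2), this yields `"Oil-Energy"` and `"Shana"`, consistent with its Persian category (نفت) and its shana.ir link.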
### Data Splits
The dataset is split into training (82,022 records), validation (5,592 records), and test (5,593 records) sets, each with the same 8 features.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
The dataset comprises numerous articles of various categories that have been crawled from six news agency websites (Tahlilbazaar, Imna, Shana, Mehr, Irna, and Khabaronline).
### Annotations
#### Annotation process
Each record (article) includes the long original text as well as a human-generated summary. The total number of cleaned articles is 93,207 (from 200,000 crawled articles).
#### Who are the annotators?
The dataset was organized by [Mehrdad Farahani](https://github.com/m3hrdadfi), [Mohammad Gharachorloo](https://github.com/baarsaam) and [Mohammad Manthouri](https://github.com/mmanthouri) for the paper [Leveraging ParsBERT and Pretrained mT5 for Persian Abstractive Text Summarization](https://arxiv.org/abs/2012.11204).
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was curated by [Mehrdad Farahani](https://github.com/m3hrdadfi), [Mohammad Gharachorloo](https://github.com/baarsaam) and [Mohammad Manthouri](https://github.com/mmanthouri).
### Licensing Information
This dataset is licensed under MIT License.
### Citation Information
```bibtex
@article{pnSummary,
      title={Leveraging ParsBERT and Pretrained mT5 for Persian Abstractive Text Summarization},
      author={Mehrdad Farahani and Mohammad Gharachorloo and Mohammad Manthouri},
      year={2020},
      eprint={2012.11204},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@m3hrdadfi](https://github.com/m3hrdadfi) for adding this dataset. |
result-kand2-sdxl-wuerst-karlo/103deca7 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 210
num_examples: 10
download_size: 1367
dataset_size: 210
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "103deca7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
frncscp/patacon-730 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Patacon-False
'1': Patacon-True
splits:
- name: train
num_bytes: 114865007.0
num_examples: 874
- name: validation
num_bytes: 18290064.0
num_examples: 143
- name: test
num_bytes: 59447780.0
num_examples: 442
download_size: 192218294
dataset_size: 192602851.0
---
# Dataset Card for "patacon-730"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gabriel1322/jeimao | ---
license: openrail
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_115 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1459046404.0
num_examples: 286537
download_size: 1479817172
dataset_size: 1459046404.0
---
# Dataset Card for "chunk_115"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_abideen__AlphaMonarch-laser | ---
pretty_name: Evaluation run of abideen/AlphaMonarch-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abideen/AlphaMonarch-laser](https://huggingface.co/abideen/AlphaMonarch-laser)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abideen__AlphaMonarch-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T21:42:42.439764](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__AlphaMonarch-laser/blob/main/results_2024-02-20T21-42-42.439764.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.650164107244002,\n\
\ \"acc_stderr\": 0.0322436754646661,\n \"acc_norm\": 0.6499808751496329,\n\
\ \"acc_norm_stderr\": 0.03291444175556814,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7790480256702841,\n\
\ \"mc2_stderr\": 0.013750619152726335\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725225,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710696\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7180840470025891,\n\
\ \"acc_stderr\": 0.004490130691020433,\n \"acc_norm\": 0.8920533758215495,\n\
\ \"acc_norm_stderr\": 0.0030967879582714177\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n\
\ \"acc_stderr\": 0.016060056268530336,\n \"acc_norm\": 0.8311926605504587,\n\
\ \"acc_norm_stderr\": 0.016060056268530336\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n\
\ \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n\
\ \"acc_stderr\": 0.016482782187500666,\n \"acc_norm\": 0.41564245810055866,\n\
\ \"acc_norm_stderr\": 0.016482782187500666\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4804432855280313,\n\
\ \"acc_stderr\": 0.012760464028289299,\n \"acc_norm\": 0.4804432855280313,\n\
\ \"acc_norm_stderr\": 0.012760464028289299\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7790480256702841,\n\
\ \"mc2_stderr\": 0.013750619152726335\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750038\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6671721000758151,\n \
\ \"acc_stderr\": 0.012979892496598287\n }\n}\n```"
repo_url: https://huggingface.co/abideen/AlphaMonarch-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|arc:challenge|25_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|gsm8k|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hellaswag|10_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T21-42-42.439764.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T21-42-42.439764.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- '**/details_harness|winogrande|5_2024-02-20T21-42-42.439764.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T21-42-42.439764.parquet'
- config_name: results
data_files:
- split: 2024_02_20T21_42_42.439764
path:
- results_2024-02-20T21-42-42.439764.parquet
- split: latest
path:
- results_2024-02-20T21-42-42.439764.parquet
---
# Dataset Card for Evaluation run of mindy-labs/mindy-7b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mindy-labs/mindy-7b-v2](https://huggingface.co/mindy-labs/mindy-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mindy-labs__mindy-7b-v2",
"harness_winogrande_5",
split="train")
```
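Every MMLU (hendrycksTest) subtask listed in the configurations above follows the same naming pattern, so the config name for any subtask can be built programmatically. A minimal sketch (`mmlu_config` is a hypothetical helper, not part of the `datasets` API):

```python
# Hypothetical helper: build the config name this dataset uses for a given
# n-shot hendrycksTest (MMLU) subtask, matching the config_name entries above.
def mmlu_config(subtask: str, n_shot: int = 5) -> str:
    return f"harness_hendrycksTest_{subtask}_{n_shot}"

print(mmlu_config("world_religions"))  # harness_hendrycksTest_world_religions_5
```

The resulting string can be passed as the second argument to `load_dataset`, exactly like `"harness_winogrande_5"` in the example above.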
## Latest results
These are the [latest results from run 2024-02-20T21:42:42.439764](https://huggingface.co/datasets/open-llm-leaderboard/details_mindy-labs__mindy-7b-v2/blob/main/results_2024-02-20T21-42-42.439764.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.650164107244002,
"acc_stderr": 0.0322436754646661,
"acc_norm": 0.6499808751496329,
"acc_norm_stderr": 0.03291444175556814,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7790480256702841,
"mc2_stderr": 0.013750619152726335
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725225,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710696
},
"harness|hellaswag|10": {
"acc": 0.7180840470025891,
"acc_stderr": 0.004490130691020433,
"acc_norm": 0.8920533758215495,
"acc_norm_stderr": 0.0030967879582714177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500666,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500666
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4804432855280313,
"acc_stderr": 0.012760464028289299,
"acc_norm": 0.4804432855280313,
"acc_norm_stderr": 0.012760464028289299
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7790480256702841,
"mc2_stderr": 0.013750619152726335
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750038
},
"harness|gsm8k|5": {
"acc": 0.6671721000758151,
"acc_stderr": 0.012979892496598287
}
}
```
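The flat task keys in this JSON make it easy to post-process the scores, for example to convert the 0-1 accuracies into the percentages shown on the leaderboard. A minimal sketch (the dict literal below copies a subset of the values above; `pct` is a hypothetical helper, not part of any leaderboard tooling):

```python
# Subset of the "Latest results" JSON above, keyed by "harness|task|n_shot".
results = {
    "all": {"acc": 0.650164107244002, "acc_norm": 0.6499808751496329},
    "harness|truthfulqa:mc|0": {"mc2": 0.7790480256702841},
    "harness|winogrande|5": {"acc": 0.846093133385951},
    "harness|gsm8k|5": {"acc": 0.6671721000758151},
}

def pct(x: float) -> float:
    """Convert a 0-1 score to a percentage rounded to two decimals."""
    return round(100 * x, 2)

# Rescale every metric of every task.
summary = {task: {metric: pct(v) for metric, v in scores.items()}
           for task, scores in results.items()}
print(summary["harness|winogrande|5"]["acc"])  # 84.61
```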
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
wucng/flower_photos_nc_5 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': daisy
'1': dandelion
'2': roses
'3': sunflowers
'4': tulips
splits:
- name: train
num_bytes: 158078551.188
num_examples: 2934
- name: test
num_bytes: 46887697.0
num_examples: 736
download_size: 231236504
dataset_size: 204966248.188
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Jayanthini/Codegen | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: train
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1004122
num_examples: 50
download_size: 332831
dataset_size: 1004122
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alvations/dslml24-jelly-submission-fr | ---
dataset_info:
- config_name: dev
features:
- name: text
dtype: string
- name: label
dtype: string
- name: prediction_oneshot
dtype: string
- name: response_oneshot
list:
- name: generated_text
dtype: string
- name: dataset
dtype: string
- name: split
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 58550850
num_examples: 17090
download_size: 12505629
dataset_size: 58550850
- config_name: test
features:
- name: text
dtype: string
- name: prediction_oneshot
dtype: string
- name: response_oneshot
list:
- name: generated_text
dtype: string
- name: dataset
dtype: string
- name: split
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 42603749
num_examples: 12000
download_size: 9801222
dataset_size: 42603749
configs:
- config_name: dev
data_files:
- split: train
path: dev/train-*
- config_name: test
data_files:
- split: train
path: test/train-*
---
|
arianhosseini/summ_dpo1b1_ngen10_max_2ndmax | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 36353598
num_examples: 20000
download_size: 22068425
dataset_size: 36353598
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Eduardovco/Garnet | ---
license: openrail
---
|
LawChat-tw/SFT | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 11724495
num_examples: 11798
download_size: 6505304
dataset_size: 11724495
---
# Dataset Card for "SFT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cohere/miracl-th-corpus-22-12 | ---
annotations_creators:
- expert-generated
language:
- th
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# MIRACL (th) embedded with cohere.ai `multilingual-22-12` encoder
We encoded the [MIRACL dataset](https://huggingface.co/miracl) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
The query embeddings can be found in [Cohere/miracl-th-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-th-queries-22-12) and the corpus embeddings can be found in [Cohere/miracl-th-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-th-corpus-22-12).
For the original datasets, see [miracl/miracl](https://huggingface.co/datasets/miracl/miracl) and [miracl/miracl-corpus](https://huggingface.co/datasets/miracl/miracl-corpus).
Dataset info:
> MIRACL 🌍🙌🌏 (Multilingual Information Retrieval Across a Continuum of Languages) is a multilingual retrieval dataset that focuses on search across 18 different languages, which collectively encompass over three billion native speakers around the world.
>
> The corpus for each language is prepared from a Wikipedia dump, where we keep only the plain text and discard images, tables, etc. Each article is segmented into multiple passages using WikiExtractor based on natural discourse units (e.g., `\n\n` in the wiki markup). Each of these passages comprises a "document" or unit of retrieval. We preserve the Wikipedia article title of each passage.
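The passage segmentation described above can be sketched as follows (a minimal illustration of splitting on `\n\n` discourse boundaries; the actual MIRACL pipeline uses WikiExtractor):

```python
def segment_article(plain_text):
    # Split extracted wiki plain text into passages on blank-line
    # discourse boundaries; drop empty fragments.
    return [p.strip() for p in plain_text.split("\n\n") if p.strip()]
```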
## Embeddings
We compute embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model that works for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
## Loading the dataset
In [miracl-th-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-th-corpus-22-12) we provide the corpus embeddings. Note that, depending on the selected split, the respective files can be quite large.
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-th-corpus-22-12", split="train")
```
Or you can also stream it without downloading it before:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-th-corpus-22-12", split="train", streaming=True)
for doc in docs:
docid = doc['docid']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
Have a look at [miracl-th-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-th-queries-22-12) where we provide the query embeddings for the MIRACL dataset.
To search in the documents, you must use **dot-product** similarity: compare the query embedding with the document embeddings, either via a vector database (recommended) or by computing the dot products directly.
A full search example:
```python
# Attention! For large datasets, this requires a lot of memory to store
# all document embeddings and to compute the dot product scores.
# Only use this for smaller datasets. For large datasets, use a vector DB
from datasets import load_dataset
import torch
# Load documents + embeddings
docs = load_dataset("Cohere/miracl-th-corpus-22-12", split="train")
doc_embeddings = torch.tensor(docs['emb'])
# Load queries
queries = load_dataset("Cohere/miracl-th-queries-22-12", split="dev")
# Select the first query as example
qid = 0
query = queries[qid]
query_embedding = torch.tensor([query['emb']])  # shape (1, dim): only the selected query
# Compute dot score between query embedding and document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
# Print results
print("Query:", query['query'])
for doc_id in top_k.indices[0].tolist():
print(docs[doc_id]['title'])
print(docs[doc_id]['text'])
```
You can get embeddings for new queries using our API:
```python
# Run: pip install cohere
import cohere
co = cohere.Client(api_key)  # add your Cohere API key here
texts = ['my search query']
response = co.embed(texts=texts, model='multilingual-22-12')
query_embedding = response.embeddings[0] # Get the embedding for the first text
```
## Performance
In the following table we compare the cohere multilingual-22-12 model with Elasticsearch version 8.6.0 lexical search (title and passage indexed as independent fields). Note that Elasticsearch doesn't support all languages that are part of the MIRACL dataset.
We compute nDCG@10 (a ranking-based metric) as well as hit@3: whether at least one relevant document appears in the top-3 results. We find hit@3 easier to interpret, as it gives the fraction of queries for which a relevant document is found among the top-3 results.
Note: MIRACL only annotated a small fraction of passages (10 per query) for relevance. Especially for larger Wikipedias (like English), we often found many more relevant passages. This is known as annotation holes. Real nDCG@10 and hit@3 performance is likely higher than reported.
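Concretely, hit@3 can be computed as follows (a minimal sketch; `ranked_runs` maps query ids to ranked document ids and `qrels` maps query ids to the set of annotated-relevant document ids — these names are illustrative, not from the evaluation code):

```python
def hit_at_k(ranked_doc_ids, relevant_ids, k=3):
    # 1 if any of the top-k retrieved documents is annotated relevant, else 0.
    return int(any(doc_id in relevant_ids for doc_id in ranked_doc_ids[:k]))

def mean_hit_at_k(ranked_runs, qrels, k=3):
    # Average hit@k over all queries.
    return sum(hit_at_k(ranked_runs[q], qrels[q], k) for q in ranked_runs) / len(ranked_runs)
```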
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | ES 8.6.0 nDCG@10 | ES 8.6.0 hit@3 |
|---|---|---|---|---|
| miracl-ar | 64.2 | 75.2 | 46.8 | 56.2 |
| miracl-bn | 61.5 | 75.7 | 49.2 | 60.1 |
| miracl-de | 44.4 | 60.7 | 19.6 | 29.8 |
| miracl-en | 44.6 | 62.2 | 30.2 | 43.2 |
| miracl-es | 47.0 | 74.1 | 27.0 | 47.2 |
| miracl-fi | 63.7 | 76.2 | 51.4 | 61.6 |
| miracl-fr | 46.8 | 57.1 | 17.0 | 21.6 |
| miracl-hi | 50.7 | 62.9 | 41.0 | 48.9 |
| miracl-id | 44.8 | 63.8 | 39.2 | 54.7 |
| miracl-ru | 49.2 | 66.9 | 25.4 | 36.7 |
| **Avg** | 51.7 | 67.5 | 34.7 | 46.0 |
Further languages (not supported by Elasticsearch):
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 |
|---|---|---|
| miracl-fa | 44.8 | 53.6 |
| miracl-ja | 49.0 | 61.0 |
| miracl-ko | 50.9 | 64.8 |
| miracl-sw | 61.4 | 74.5 |
| miracl-te | 67.8 | 72.3 |
| miracl-th | 60.2 | 71.9 |
| miracl-yo | 56.4 | 62.2 |
| miracl-zh | 43.8 | 56.5 |
| **Avg** | 54.3 | 64.6 |
|
tyoung560/ai-assist-logs | ---
license: unknown
---
|
McSpicyWithMilo/target-elements-0.1split-new-move | ---
dataset_info:
features:
- name: target_element
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 11766.6
num_examples: 90
- name: test
num_bytes: 1307.4
num_examples: 10
download_size: 9841
dataset_size: 13074.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "target-elements-0.1split-new-move"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AstroAlpha/Personal_Dataset | ---
license: mit
task_categories:
- conversational
language:
- en
tags:
- not-for-all-audiences
pretty_name: Savita
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Sleoruiz/disc_cla_sexta-2 | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: comision
dtype: string
- name: fecha_gaceta
dtype: string
- name: gaceta_numero
dtype: string
- name: name
dtype: string
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
sequence: string
- name: annotation_agent
dtype: string
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 15176429
num_examples: 7591
download_size: 7564523
dataset_size: 15176429
---
# Dataset Card for "disc_cla_sexta-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlaGrine/codeparrot-sklearn | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: path
dtype: string
- name: copies
dtype: string
- name: size
dtype: string
- name: content
dtype: string
- name: license
dtype: string
splits:
- name: train
num_bytes: 3147402833.3951
num_examples: 241075
- name: valid
num_bytes: 17472318.29500301
num_examples: 1312
download_size: 966099631
dataset_size: 3164875151.690103
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
license: mit
task_categories:
- conversational
---
|
a-moron/aeroponics | ---
language:
- en
pretty_name: Chunked Papers for Aeroponics
---
This dataset contains chunked extracts (of ~300 tokens) from papers related to aeroponic agriculture. |